Apr 17 16:31:36.647143 ip-10-0-141-239 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 16:31:37.068110 ip-10-0-141-239 kubenswrapper[2548]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:37.068110 ip-10-0-141-239 kubenswrapper[2548]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 16:31:37.068110 ip-10-0-141-239 kubenswrapper[2548]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:37.068110 ip-10-0-141-239 kubenswrapper[2548]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 16:31:37.068110 ip-10-0-141-239 kubenswrapper[2548]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:37.069250 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.068769 2548 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 16:31:37.072708 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072692 2548 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:37.072708 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072708 2548 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:37.072775 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072711 2548 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:37.072775 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072715 2548 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:37.072775 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072717 2548 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:37.072775 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072720 2548 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:37.072775 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072722 2548 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:37.072775 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072725 2548 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:37.072775 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072728 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:37.072775 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072731 2548 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:37.072775 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072734 2548 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:37.072775 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072736 2548 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:37.072775 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072739 2548 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:37.072775 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072741 2548 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:37.072775 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072744 2548 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:37.072775 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072746 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:37.072775 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072749 2548 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:37.072775 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072751 2548 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:37.072775 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072754 2548 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:37.072775 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072756 2548 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:37.072775 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072763 2548 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:37.072775 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072766 2548 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:37.073275 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072768 2548 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:37.073275 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072771 2548 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:37.073275 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072773 2548 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:37.073275 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072776 2548 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:37.073275 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072779 2548 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:37.073275 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072781 2548 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:37.073275 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072784 2548 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:37.073275 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072787 2548 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:37.073275 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072790 2548 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:37.073275 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072792 2548 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:37.073275 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072795 2548 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:37.073275 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072797 2548 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:37.073275 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072800 2548 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:37.073275 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072803 2548 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:37.073275 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072805 2548 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:37.073275 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072807 2548 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:37.073275 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072813 2548 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:37.073275 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072816 2548 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:37.073275 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072819 2548 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:37.073731 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072821 2548 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:37.073731 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072824 2548 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:37.073731 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072826 2548 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:37.073731 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072828 2548 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:37.073731 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072831 2548 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:37.073731 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072833 2548 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:37.073731 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072835 2548 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:37.073731 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072838 2548 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:37.073731 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072840 2548 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:37.073731 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072842 2548 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:37.073731 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072845 2548 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:37.073731 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072848 2548 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:37.073731 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072851 2548 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:37.073731 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072854 2548 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:37.073731 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072857 2548 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:37.073731 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072859 2548 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:37.073731 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072862 2548 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:37.073731 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072864 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:37.073731 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072867 2548 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:37.073731 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072870 2548 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:37.074258 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072872 2548 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:37.074258 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072875 2548 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:37.074258 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072877 2548 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:37.074258 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072880 2548 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:37.074258 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072882 2548 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:37.074258 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072885 2548 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:37.074258 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072888 2548 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:37.074258 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072890 2548 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:37.074258 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072908 2548 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:37.074258 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072912 2548 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:37.074258 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072916 2548 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:37.074258 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072920 2548 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:37.074258 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072922 2548 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:37.074258 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072925 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:37.074258 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072928 2548 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:37.074258 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072930 2548 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:37.074258 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072935 2548 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:37.074258 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072939 2548 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:37.074258 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072943 2548 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:37.074704 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072945 2548 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:37.074704 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072948 2548 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:37.074704 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072950 2548 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:37.074704 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072953 2548 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:37.074704 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072956 2548 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:37.074704 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.072958 2548 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:37.074704 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073343 2548 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:37.074704 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073348 2548 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:37.074704 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073352 2548 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:37.074704 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073354 2548 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:37.074704 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073357 2548 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:37.074704 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073360 2548 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:37.074704 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073362 2548 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:37.074704 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073365 2548 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:37.074704 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073368 2548 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:37.074704 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073370 2548 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:37.074704 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073373 2548 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:37.074704 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073375 2548 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:37.074704 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073379 2548 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:37.074704 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073381 2548 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:37.075260 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073384 2548 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:37.075260 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073386 2548 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:37.075260 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073389 2548 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:37.075260 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073391 2548 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:37.075260 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073394 2548 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:37.075260 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073397 2548 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:37.075260 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073401 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:37.075260 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073404 2548 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:37.075260 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073407 2548 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:37.075260 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073409 2548 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:37.075260 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073412 2548 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:37.075260 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073414 2548 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:37.075260 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073417 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:37.075260 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073420 2548 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:37.075260 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073422 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:37.075260 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073424 2548 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:37.075260 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073427 2548 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:37.075260 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073430 2548 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:37.075260 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073432 2548 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:37.075728 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073435 2548 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:37.075728 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073438 2548 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:37.075728 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073440 2548 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:37.075728 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073443 2548 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:37.075728 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073445 2548 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:37.075728 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073447 2548 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:37.075728 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073450 2548 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:37.075728 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073452 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:37.075728 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073455 2548 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:37.075728 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073457 2548 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:37.075728 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073460 2548 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:37.075728 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073462 2548 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:37.075728 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073465 2548 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:37.075728 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073467 2548 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:37.075728 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073470 2548 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:37.075728 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073472 2548 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:37.075728 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073475 2548 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:37.075728 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073477 2548 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:37.075728 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073480 2548 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:37.075728 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073482 2548 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:37.076259 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073485 2548 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:37.076259 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073488 2548 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:37.076259 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073490 2548 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:37.076259 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073492 2548 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:37.076259 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073495 2548 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:37.076259 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073497 2548 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:37.076259 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073499 2548 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:37.076259 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073502 2548 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:37.076259 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073504 2548 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:37.076259 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073507 2548 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:37.076259 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073510 2548 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:37.076259 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073513 2548 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:37.076259 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073515 2548 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:37.076259 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073518 2548 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:37.076259 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073520 2548 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:37.076259 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073522 2548 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:37.076259 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073525 2548 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:37.076259 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073527 2548 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:37.076259 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073530 2548 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:37.076259 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073532 2548 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:37.076740 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073534 2548 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:37.076740 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073537 2548 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:37.076740 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073540 2548 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:37.076740 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073542 2548 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:37.076740 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073545 2548 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:37.076740 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073548 2548 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:37.076740 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073550 2548 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:37.076740 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073553 2548 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:37.076740 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073557 2548 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:37.076740 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073560 2548 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:37.076740 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073564 2548 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:37.076740 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073567 2548 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:37.076740 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.073570 2548 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:37.076740 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074712 2548 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 16:31:37.076740 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074720 2548 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 16:31:37.076740 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074727 2548 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 16:31:37.076740 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074731 2548 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 16:31:37.076740 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074736 2548 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 16:31:37.076740 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074739 2548 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 16:31:37.076740 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074743 2548 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 16:31:37.076740 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074747 2548 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074750 2548 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074753 2548 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074757 2548 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074760 2548 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074763 2548 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074766 2548 flags.go:64] FLAG: --cgroup-root=""
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074769 2548 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074772 2548 flags.go:64] FLAG: --client-ca-file=""
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074775 2548 flags.go:64] FLAG: --cloud-config=""
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074778 2548 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074781 2548 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074785 2548 flags.go:64] FLAG: --cluster-domain=""
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074787 2548 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074791 2548 flags.go:64] FLAG: --config-dir=""
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074793 2548 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074796 2548 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074805 2548 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074809 2548 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074812 2548 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074815 2548 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074818 2548 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074821 2548 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074824 2548 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074827 2548 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 16:31:37.077286 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074830 2548 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074834 2548 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074837 2548 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074840 2548 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074843 2548 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074846 2548 flags.go:64] FLAG: --enable-server="true"
Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074849 2548 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074854 2548 flags.go:64] FLAG: --event-burst="100"
Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074857 2548 flags.go:64] FLAG: --event-qps="50"
Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074860 2548 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074864 2548 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074867 2548 flags.go:64] FLAG: --eviction-hard=""
Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074870 2548 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074873 2548 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074876 2548 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074879 2548 flags.go:64] FLAG: --eviction-soft=""
Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074882 2548 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074885 2548 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074887 2548 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074890 2548 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074906 2548 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 
17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074909 2548 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074912 2548 flags.go:64] FLAG: --feature-gates="" Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074916 2548 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074919 2548 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 16:31:37.077883 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074922 2548 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074926 2548 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074929 2548 flags.go:64] FLAG: --healthz-port="10248" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074932 2548 flags.go:64] FLAG: --help="false" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074935 2548 flags.go:64] FLAG: --hostname-override="ip-10-0-141-239.ec2.internal" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074938 2548 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074941 2548 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074944 2548 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074947 2548 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074950 2548 flags.go:64] 
FLAG: --image-gc-high-threshold="85" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074953 2548 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074955 2548 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074958 2548 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074961 2548 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074964 2548 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074967 2548 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074970 2548 flags.go:64] FLAG: --kube-reserved="" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074973 2548 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074976 2548 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074979 2548 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074982 2548 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074985 2548 flags.go:64] FLAG: --lock-file="" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074987 2548 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 16:31:37.078497 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074990 2548 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 16:31:37.078497 ip-10-0-141-239 
kubenswrapper[2548]: I0417 16:31:37.074993 2548 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.074998 2548 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075001 2548 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075004 2548 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075006 2548 flags.go:64] FLAG: --logging-format="text" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075009 2548 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075013 2548 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075016 2548 flags.go:64] FLAG: --manifest-url="" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075018 2548 flags.go:64] FLAG: --manifest-url-header="" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075022 2548 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075031 2548 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075036 2548 flags.go:64] FLAG: --max-pods="110" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075039 2548 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075042 2548 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075045 2548 flags.go:64] FLAG: 
--memory-manager-policy="None" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075047 2548 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075050 2548 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075053 2548 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075056 2548 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075063 2548 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075066 2548 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075069 2548 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075072 2548 flags.go:64] FLAG: --pod-cidr="" Apr 17 16:31:37.079081 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075075 2548 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075080 2548 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075083 2548 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075086 2548 flags.go:64] FLAG: --pods-per-core="0" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075089 2548 flags.go:64] FLAG: --port="10250" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075092 2548 
flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075095 2548 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a5da24c672b68550" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075098 2548 flags.go:64] FLAG: --qos-reserved="" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075101 2548 flags.go:64] FLAG: --read-only-port="10255" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075104 2548 flags.go:64] FLAG: --register-node="true" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075107 2548 flags.go:64] FLAG: --register-schedulable="true" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075109 2548 flags.go:64] FLAG: --register-with-taints="" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075113 2548 flags.go:64] FLAG: --registry-burst="10" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075116 2548 flags.go:64] FLAG: --registry-qps="5" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075119 2548 flags.go:64] FLAG: --reserved-cpus="" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075121 2548 flags.go:64] FLAG: --reserved-memory="" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075125 2548 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075128 2548 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075131 2548 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075133 2548 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 
16:31:37.075136 2548 flags.go:64] FLAG: --runonce="false" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075139 2548 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075142 2548 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075144 2548 flags.go:64] FLAG: --seccomp-default="false" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075147 2548 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075150 2548 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 16:31:37.079625 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075153 2548 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075156 2548 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075159 2548 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075162 2548 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075164 2548 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075167 2548 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075170 2548 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075172 2548 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075175 2548 flags.go:64] FLAG: 
--system-cgroups="" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075178 2548 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075183 2548 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075186 2548 flags.go:64] FLAG: --tls-cert-file="" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075189 2548 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075192 2548 flags.go:64] FLAG: --tls-min-version="" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075195 2548 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075198 2548 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075200 2548 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075203 2548 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075206 2548 flags.go:64] FLAG: --v="2" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075210 2548 flags.go:64] FLAG: --version="false" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075214 2548 flags.go:64] FLAG: --vmodule="" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075218 2548 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.075222 2548 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: W0417 
16:31:37.075319 2548 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:31:37.080273 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075322 2548 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:31:37.080852 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075326 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:31:37.080852 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075329 2548 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:31:37.080852 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075337 2548 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:31:37.080852 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075340 2548 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:31:37.080852 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075343 2548 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:31:37.080852 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075346 2548 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:31:37.080852 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075348 2548 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:31:37.080852 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075351 2548 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:31:37.080852 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075354 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:37.080852 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075356 2548 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:37.080852 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075359 2548 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:31:37.080852 
ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075361 2548 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:31:37.080852 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075363 2548 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:31:37.080852 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075366 2548 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:31:37.080852 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075370 2548 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:31:37.080852 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075373 2548 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:31:37.080852 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075376 2548 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:37.080852 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075378 2548 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:31:37.080852 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075381 2548 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:31:37.080852 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075383 2548 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:31:37.081412 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075386 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:31:37.081412 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075388 2548 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:31:37.081412 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075391 2548 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:31:37.081412 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075393 2548 feature_gate.go:328] unrecognized 
feature gate: NoRegistryClusterOperations Apr 17 16:31:37.081412 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075396 2548 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:37.081412 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075398 2548 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:31:37.081412 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075401 2548 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:31:37.081412 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075403 2548 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:31:37.081412 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075405 2548 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:31:37.081412 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075408 2548 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:31:37.081412 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075410 2548 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:31:37.081412 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075413 2548 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:31:37.081412 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075417 2548 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 16:31:37.081412 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075421 2548 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:31:37.081412 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075425 2548 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:31:37.081412 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075434 2548 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:31:37.081412 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075437 2548 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:31:37.081412 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075439 2548 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:31:37.081412 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075442 2548 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:31:37.081882 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075444 2548 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:31:37.081882 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075447 2548 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:31:37.081882 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075449 2548 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:31:37.081882 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075452 2548 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:31:37.081882 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075454 2548 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:31:37.081882 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075456 2548 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:31:37.081882 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075459 2548 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:31:37.081882 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075462 2548 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:31:37.081882 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075465 2548 
feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:31:37.081882 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075468 2548 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:31:37.081882 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075470 2548 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:31:37.081882 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075472 2548 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:31:37.081882 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075475 2548 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:31:37.081882 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075477 2548 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:31:37.081882 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075480 2548 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:31:37.081882 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075483 2548 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:31:37.081882 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075485 2548 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:31:37.081882 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075488 2548 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:31:37.081882 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075490 2548 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:31:37.082347 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075493 2548 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:31:37.082347 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075495 2548 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 
17 16:31:37.082347 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075498 2548 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:31:37.082347 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075500 2548 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:37.082347 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075502 2548 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:31:37.082347 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075505 2548 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:31:37.082347 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075507 2548 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:31:37.082347 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075509 2548 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:31:37.082347 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075512 2548 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:31:37.082347 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075514 2548 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:31:37.082347 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075517 2548 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:31:37.082347 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075520 2548 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:31:37.082347 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075522 2548 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:31:37.082347 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075525 2548 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:31:37.082347 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075527 2548 
feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:31:37.082347 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075529 2548 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:31:37.082347 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075532 2548 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:31:37.082347 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075534 2548 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:31:37.082347 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075537 2548 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:31:37.082347 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075539 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:31:37.082827 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075546 2548 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:31:37.082827 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075548 2548 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:31:37.082827 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075551 2548 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:31:37.082827 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075553 2548 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:31:37.082827 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075556 2548 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:31:37.082827 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.075558 2548 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:31:37.082827 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.076153 2548 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true 
MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:31:37.083025 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.082985 2548 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 16:31:37.083025 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.083001 2548 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 16:31:37.083078 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083050 2548 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:31:37.083078 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083056 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:37.083078 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083059 2548 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:31:37.083078 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083062 2548 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:31:37.083078 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083066 2548 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:31:37.083078 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083070 2548 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:37.083078 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083073 2548 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:37.083078 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083076 2548 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:37.083078 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083079 2548 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:37.083078 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083081 2548 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:37.083324 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083084 2548 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:37.083324 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083087 2548 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:37.083324 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083090 2548 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:37.083324 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083093 2548 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:37.083324 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083095 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:37.083324 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083098 2548 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:37.083324 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083101 2548 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:37.083324 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083103 2548 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:37.083324 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083106 2548 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:37.083324 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083109 2548 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:37.083324 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083111 2548 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:37.083324 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083114 2548 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:37.083324 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083116 2548 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:37.083324 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083119 2548 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:37.083324 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083122 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:37.083324 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083124 2548 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:37.083324 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083127 2548 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:37.083324 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083129 2548 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:37.083324 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083132 2548 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:37.083324 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083134 2548 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:37.083821 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083137 2548 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:37.083821 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083140 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:37.083821 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083143 2548 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:37.083821 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083145 2548 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:37.083821 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083147 2548 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:37.083821 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083150 2548 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:37.083821 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083152 2548 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:37.083821 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083155 2548 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:37.083821 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083157 2548 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:37.083821 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083159 2548 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:37.083821 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083162 2548 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:37.083821 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083165 2548 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:37.083821 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083167 2548 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:37.083821 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083171 2548 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:37.083821 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083173 2548 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:37.083821 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083176 2548 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:37.083821 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083178 2548 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:37.083821 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083180 2548 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:37.083821 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083183 2548 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:37.084312 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083185 2548 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:37.084312 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083188 2548 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:37.084312 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083190 2548 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:37.084312 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083193 2548 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:37.084312 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083195 2548 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:37.084312 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083198 2548 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:37.084312 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083200 2548 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:37.084312 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083202 2548 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:37.084312 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083204 2548 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:37.084312 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083207 2548 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:37.084312 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083209 2548 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:37.084312 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083212 2548 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:37.084312 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083214 2548 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:37.084312 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083217 2548 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:37.084312 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083220 2548 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:37.084312 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083229 2548 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:37.084312 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083234 2548 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:37.084312 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083238 2548 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:37.084312 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083241 2548 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:37.084312 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083244 2548 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:37.084798 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083247 2548 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:37.084798 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083249 2548 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:37.084798 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083251 2548 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:37.084798 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083254 2548 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:37.084798 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083256 2548 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:37.084798 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083259 2548 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:37.084798 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083262 2548 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:37.084798 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083265 2548 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:37.084798 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083267 2548 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:37.084798 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083270 2548 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:37.084798 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083272 2548 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:37.084798 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083275 2548 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:37.084798 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083277 2548 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:37.084798 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083279 2548 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:37.084798 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083282 2548 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:37.084798 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083284 2548 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:37.084798 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083287 2548 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:37.085270 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.083292 2548 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:31:37.085270 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083429 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:37.085270 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083437 2548 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:37.085270 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083440 2548 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:37.085270 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083444 2548 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:37.085270 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083448 2548 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:37.085270 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083451 2548 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:37.085270 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083453 2548 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:37.085270 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083457 2548 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:37.085270 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083459 2548 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:37.085270 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083462 2548 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:37.085270 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083471 2548 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:37.085270 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083473 2548 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:37.085270 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083476 2548 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:37.085270 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083479 2548 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:37.085643 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083481 2548 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:37.085643 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083484 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:37.085643 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083486 2548 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:37.085643 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083489 2548 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:37.085643 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083491 2548 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:37.085643 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083494 2548 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:37.085643 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083496 2548 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:37.085643 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083499 2548 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:37.085643 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083501 2548 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:37.085643 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083504 2548 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:37.085643 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083506 2548 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:37.085643 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083508 2548 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:37.085643 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083511 2548 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:37.085643 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083513 2548 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:37.085643 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083516 2548 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:37.085643 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083518 2548 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:37.085643 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083521 2548 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:37.085643 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083523 2548 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:37.085643 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083526 2548 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:37.086154 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083528 2548 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:37.086154 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083531 2548 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:37.086154 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083533 2548 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:37.086154 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083535 2548 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:37.086154 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083538 2548 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:37.086154 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083540 2548 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:37.086154 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083542 2548 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:37.086154 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083545 2548 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:37.086154 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083547 2548 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:37.086154 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083550 2548 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:37.086154 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083558 2548 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:37.086154 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083561 2548 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:37.086154 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083564 2548 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:37.086154 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083566 2548 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:37.086154 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083570 2548 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:37.086154 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083573 2548 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:37.086154 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083576 2548 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:37.086154 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083578 2548 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:37.086154 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083581 2548 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:37.086154 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083583 2548 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:37.086644 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083586 2548 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:37.086644 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083588 2548 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:37.086644 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083591 2548 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:37.086644 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083593 2548 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:37.086644 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083596 2548 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:37.086644 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083598 2548 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:37.086644 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083600 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:37.086644 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083608 2548 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:37.086644 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083611 2548 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:37.086644 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083613 2548 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:37.086644 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083616 2548 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:37.086644 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083618 2548 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:37.086644 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083621 2548 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:37.086644 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083623 2548 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:37.086644 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083625 2548 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:37.086644 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083628 2548 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:37.086644 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083630 2548 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:37.086644 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083633 2548 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:37.086644 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083635 2548 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:37.086644 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083637 2548 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:37.087137 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083640 2548 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:37.087137 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083642 2548 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:37.087137 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083645 2548 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:37.087137 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083653 2548 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:37.087137 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083655 2548 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:37.087137 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083658 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:37.087137 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083661 2548 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:37.087137 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083663 2548 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:37.087137 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083665 2548 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:37.087137 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083668 2548 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:37.087137 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083670 2548 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:37.087137 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083672 2548 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:37.087137 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:37.083675 2548 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:37.087137 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.083680 2548 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:31:37.087137 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.084263 2548 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 16:31:37.087519 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.086352 2548 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 16:31:37.087519 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.087288 2548 server.go:1019] "Starting client certificate rotation"
Apr 17 16:31:37.087519 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.087384 2548 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:31:37.087519 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.087419 2548 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:31:37.115504 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.115479 2548 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:31:37.117970 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.117939 2548 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:31:37.130241 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.130220 2548 log.go:25] "Validated CRI v1 runtime API"
Apr 17 16:31:37.135799 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.135784 2548 log.go:25] "Validated CRI v1 image API"
Apr 17 16:31:37.139322 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.139307 2548 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 16:31:37.142443 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.142417 2548 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 84532409-a425-4667-a620-13907c5608cd:/dev/nvme0n1p4 d7c30f4b-1b58-475f-a563-a6bfdf6384fb:/dev/nvme0n1p3]
Apr 17 16:31:37.142509 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.142443 2548 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 16:31:37.144169 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.144031 2548 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 16:31:37.148786 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.148675 2548 manager.go:217] Machine: {Timestamp:2026-04-17 16:31:37.146410798 +0000 UTC m=+0.384926571 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100107 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2969f4c62db17db28a2fcd77f26393 SystemUUID:ec2969f4-c62d-b17d-b28a-2fcd77f26393 BootID:ba0125ad-d4a0-4b76-8b75-2a65e515bac9 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:75:1b:33:18:f7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:75:1b:33:18:f7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:9e:f7:ec:e4:98:6d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 16:31:37.148786 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.148786 2548 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 16:31:37.148890 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.148864 2548 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 16:31:37.151140 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.151117 2548 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 16:31:37.151279 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.151143 2548 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-239.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessTh
an","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 16:31:37.151322 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.151290 2548 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 16:31:37.151322 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.151297 2548 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 16:31:37.151322 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.151310 2548 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 16:31:37.152171 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.152161 2548 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 16:31:37.152951 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.152942 2548 state_mem.go:36] "Initialized new in-memory state store" Apr 17 16:31:37.153068 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.153059 2548 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 16:31:37.155438 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.155429 2548 kubelet.go:491] "Attempting to sync node with API server" Apr 17 16:31:37.155475 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.155442 2548 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 16:31:37.155475 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.155461 2548 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 16:31:37.155475 ip-10-0-141-239 kubenswrapper[2548]: I0417 
16:31:37.155471 2548 kubelet.go:397] "Adding apiserver pod source" Apr 17 16:31:37.155559 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.155482 2548 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 16:31:37.156456 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.156445 2548 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 16:31:37.156515 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.156462 2548 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 16:31:37.161534 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.161512 2548 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 16:31:37.163367 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.163320 2548 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 16:31:37.165075 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.165062 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 16:31:37.165134 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.165081 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 16:31:37.165134 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.165088 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 16:31:37.165134 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.165093 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 16:31:37.165134 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.165098 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 16:31:37.165134 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.165104 2548 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/secret" Apr 17 16:31:37.165134 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.165109 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 16:31:37.165134 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.165115 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 16:31:37.165134 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.165122 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 16:31:37.165134 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.165128 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 16:31:37.165134 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.165137 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 16:31:37.165400 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.165146 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 16:31:37.165983 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.165972 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 16:31:37.166013 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.165984 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 16:31:37.166960 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.166939 2548 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dpdhk" Apr 17 16:31:37.169165 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:37.169145 2548 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-239.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 16:31:37.169239 ip-10-0-141-239 
kubenswrapper[2548]: E0417 16:31:37.169159 2548 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 16:31:37.169428 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.169414 2548 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-239.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 16:31:37.169573 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.169562 2548 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 16:31:37.169607 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.169597 2548 server.go:1295] "Started kubelet" Apr 17 16:31:37.169720 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.169694 2548 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 16:31:37.169802 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.169752 2548 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 16:31:37.169843 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.169826 2548 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 16:31:37.170466 ip-10-0-141-239 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 16:31:37.170870 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.170792 2548 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 16:31:37.172282 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.172267 2548 server.go:317] "Adding debug handlers to kubelet server" Apr 17 16:31:37.173280 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.173255 2548 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dpdhk" Apr 17 16:31:37.178502 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:37.178483 2548 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 16:31:37.179754 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.179737 2548 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 16:31:37.180359 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.180345 2548 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 16:31:37.181007 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.180990 2548 factory.go:55] Registering systemd factory Apr 17 16:31:37.181087 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.181059 2548 factory.go:223] Registration of the systemd container factory successfully Apr 17 16:31:37.181087 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.181069 2548 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 16:31:37.181087 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.181072 2548 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 16:31:37.181236 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.181095 2548 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 16:31:37.181236 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:37.181153 2548 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-239.ec2.internal\" not found" Apr 17 16:31:37.181236 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.181215 2548 reconstruct.go:97] "Volume reconstruction finished" Apr 17 16:31:37.181236 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.181224 2548 reconciler.go:26] "Reconciler: start to sync state" Apr 17 16:31:37.181592 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.181568 2548 factory.go:153] Registering CRI-O factory Apr 17 16:31:37.181666 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.181598 2548 factory.go:223] Registration of the crio container factory successfully Apr 17 16:31:37.181666 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.181646 2548 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 16:31:37.181767 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.181670 2548 factory.go:103] Registering Raw factory Apr 17 16:31:37.181767 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.181683 2548 manager.go:1196] Started watching for new ooms in manager Apr 17 16:31:37.182137 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.182120 2548 manager.go:319] Starting recovery of all containers Apr 17 16:31:37.185329 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.185308 2548 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:37.189455 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:37.189414 2548 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-141-239.ec2.internal\" not found" node="ip-10-0-141-239.ec2.internal" Apr 17 16:31:37.198647 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.198630 2548 manager.go:324] 
Recovery completed Apr 17 16:31:37.202402 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.202385 2548 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:37.204682 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.204667 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:37.204741 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.204697 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:37.204741 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.204710 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:37.205160 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.205146 2548 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 16:31:37.205160 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.205159 2548 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 16:31:37.205276 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.205178 2548 state_mem.go:36] "Initialized new in-memory state store" Apr 17 16:31:37.207634 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.207621 2548 policy_none.go:49] "None policy: Start" Apr 17 16:31:37.207669 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.207638 2548 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 16:31:37.207669 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.207647 2548 state_mem.go:35] "Initializing new in-memory state store" Apr 17 16:31:37.255473 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.255457 2548 manager.go:341] "Starting Device Plugin manager" Apr 17 16:31:37.265292 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:37.255521 2548 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is 
not found" checkpoint="kubelet_internal_checkpoint" Apr 17 16:31:37.265292 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.255537 2548 server.go:85] "Starting device plugin registration server" Apr 17 16:31:37.265292 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.255754 2548 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 16:31:37.265292 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.255764 2548 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 16:31:37.265292 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.255927 2548 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 16:31:37.265292 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.256038 2548 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 16:31:37.265292 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.256071 2548 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 16:31:37.265292 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:37.256403 2548 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 16:31:37.265292 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:37.256440 2548 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-239.ec2.internal\" not found" Apr 17 16:31:37.319417 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.319364 2548 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 16:31:37.320702 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.320687 2548 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 17 16:31:37.320806 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.320715 2548 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 16:31:37.320806 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.320732 2548 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 16:31:37.320806 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.320741 2548 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 16:31:37.320806 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:37.320778 2548 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 16:31:37.323039 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.323019 2548 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:37.356872 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.356858 2548 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:37.357916 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.357876 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:37.358007 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.357926 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:37.358007 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.357941 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:37.358007 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.357974 2548 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-239.ec2.internal" Apr 17 16:31:37.366442 ip-10-0-141-239 kubenswrapper[2548]: I0417 
16:31:37.366427 2548 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-239.ec2.internal" Apr 17 16:31:37.366518 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:37.366448 2548 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-239.ec2.internal\": node \"ip-10-0-141-239.ec2.internal\" not found" Apr 17 16:31:37.394246 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:37.394223 2548 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-239.ec2.internal\" not found" Apr 17 16:31:37.421269 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.421233 2548 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal"] Apr 17 16:31:37.421348 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.421311 2548 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:37.422100 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.422085 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:37.422179 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.422118 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:37.422179 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.422132 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:37.423421 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.423406 2548 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:37.423550 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.423536 2548 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal" Apr 17 16:31:37.423596 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.423563 2548 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:37.424084 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.424064 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:37.424084 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.424078 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:37.424212 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.424089 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:37.424212 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.424098 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:37.424212 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.424099 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:37.424212 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.424113 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:37.425663 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.425645 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal" Apr 17 16:31:37.425722 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.425680 2548 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:37.426305 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.426292 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:37.426379 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.426315 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:37.426379 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.426325 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:37.452448 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:37.452432 2548 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-239.ec2.internal\" not found" node="ip-10-0-141-239.ec2.internal" Apr 17 16:31:37.456618 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:37.456603 2548 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-239.ec2.internal\" not found" node="ip-10-0-141-239.ec2.internal" Apr 17 16:31:37.495160 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:37.495140 2548 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-239.ec2.internal\" not found" Apr 17 16:31:37.582629 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.582578 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/a563df7ee7058d21a512abceee773bee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal\" (UID: \"a563df7ee7058d21a512abceee773bee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal" Apr 17 16:31:37.582629 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.582608 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1deed1599e86e9837e6b4d3fcce1e268-config\") pod \"kube-apiserver-proxy-ip-10-0-141-239.ec2.internal\" (UID: \"1deed1599e86e9837e6b4d3fcce1e268\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal" Apr 17 16:31:37.582757 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.582627 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a563df7ee7058d21a512abceee773bee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal\" (UID: \"a563df7ee7058d21a512abceee773bee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal" Apr 17 16:31:37.595334 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:37.595311 2548 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-239.ec2.internal\" not found" Apr 17 16:31:37.682847 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.682827 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1deed1599e86e9837e6b4d3fcce1e268-config\") pod \"kube-apiserver-proxy-ip-10-0-141-239.ec2.internal\" (UID: \"1deed1599e86e9837e6b4d3fcce1e268\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal" Apr 17 16:31:37.682988 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.682854 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a563df7ee7058d21a512abceee773bee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal\" (UID: \"a563df7ee7058d21a512abceee773bee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal" Apr 17 16:31:37.682988 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.682875 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a563df7ee7058d21a512abceee773bee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal\" (UID: \"a563df7ee7058d21a512abceee773bee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal" Apr 17 16:31:37.682988 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.682920 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a563df7ee7058d21a512abceee773bee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal\" (UID: \"a563df7ee7058d21a512abceee773bee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal" Apr 17 16:31:37.682988 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.682943 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1deed1599e86e9837e6b4d3fcce1e268-config\") pod \"kube-apiserver-proxy-ip-10-0-141-239.ec2.internal\" (UID: \"1deed1599e86e9837e6b4d3fcce1e268\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal" Apr 17 16:31:37.682988 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.682961 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a563df7ee7058d21a512abceee773bee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal\" (UID: \"a563df7ee7058d21a512abceee773bee\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal"
Apr 17 16:31:37.695944 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:37.695925 2548 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-239.ec2.internal\" not found"
Apr 17 16:31:37.754134 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.754107 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal"
Apr 17 16:31:37.759701 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:37.759684 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal"
Apr 17 16:31:37.796637 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:37.796605 2548 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-239.ec2.internal\" not found"
Apr 17 16:31:37.897197 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:37.897139 2548 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-239.ec2.internal\" not found"
Apr 17 16:31:37.997606 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:37.997585 2548 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-239.ec2.internal\" not found"
Apr 17 16:31:38.087240 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.087218 2548 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 16:31:38.087832 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.087354 2548 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 16:31:38.087832 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.087377 2548 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 16:31:38.098373 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:38.098345 2548 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-239.ec2.internal\" not found"
Apr 17 16:31:38.154160 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.154091 2548 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:31:38.156018 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.156000 2548 apiserver.go:52] "Watching apiserver"
Apr 17 16:31:38.164032 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.164011 2548 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 16:31:38.165727 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.165705 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-f29ht","openshift-network-diagnostics/network-check-target-29tlc","openshift-ovn-kubernetes/ovnkube-node-lq8np","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc","openshift-multus/multus-additional-cni-plugins-2qscf","openshift-multus/network-metrics-daemon-zsg2s","openshift-network-operator/iptables-alerter-csqpf","kube-system/konnectivity-agent-g48dw","openshift-cluster-node-tuning-operator/tuned-x2tmr","openshift-image-registry/node-ca-p555x"]
Apr 17 16:31:38.168469 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.168444 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-f29ht"
Apr 17 16:31:38.168639 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.168613 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29tlc"
Apr 17 16:31:38.168750 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:38.168728 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29tlc" podUID="855921ad-75be-4568-9884-d3f6c5e1a862"
Apr 17 16:31:38.170135 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.170120 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:31:38.171194 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.171175 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 16:31:38.171887 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.171649 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc"
Apr 17 16:31:38.174090 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.173602 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2qscf"
Apr 17 16:31:38.174815 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.174630 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 16:31:38.174815 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.174767 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 16:31:38.174994 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.174961 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 16:31:38.175127 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.175111 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 16:31:38.175192 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.175134 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 16:31:38.175239 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.175198 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 16:31:38.175291 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.175273 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8p95g\""
Apr 17 16:31:38.175291 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.175283 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 16:31:38.175388 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.175307 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 16:31:38.175388 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.175368 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 16:31:38.175458 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.175385 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 16:31:38.175458 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.175389 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsg2s"
Apr 17 16:31:38.175551 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:38.175491 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsg2s" podUID="60cbc498-937e-4f93-95af-294c0a8e7beb"
Apr 17 16:31:38.175658 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.175644 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 16:31:38.175752 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.175735 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 16:31:38.175805 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.175768 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 16:31:38.175805 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.175736 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-thqqx\""
Apr 17 16:31:38.175976 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.175836 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-85c7f\""
Apr 17 16:31:38.175976 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.175879 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-krbhr\""
Apr 17 16:31:38.176084 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.176069 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 16:31:38.176595 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.176537 2548 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 16:26:37 +0000 UTC" deadline="2027-10-26 06:02:59.515874521 +0000 UTC"
Apr 17 16:31:38.176652 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.176594 2548 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13357h31m21.339283207s"
Apr 17 16:31:38.176703 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.176673 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-csqpf"
Apr 17 16:31:38.177880 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.177862 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-g48dw"
Apr 17 16:31:38.178534 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.178514 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:31:38.178616 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.178594 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 16:31:38.178824 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.178811 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 16:31:38.179192 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.179177 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x2tmr"
Apr 17 16:31:38.179579 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.179561 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-pfgg7\""
Apr 17 16:31:38.180189 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.180176 2548 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 16:31:38.180241 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.180221 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-p555x"
Apr 17 16:31:38.180489 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.180474 2548 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal"
Apr 17 16:31:38.180956 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.180914 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 16:31:38.182126 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.182113 2548 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 16:31:38.182722 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.182701 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-xjsmz\""
Apr 17 16:31:38.182799 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.182705 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 16:31:38.183009 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.182983 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-7vn4z\""
Apr 17 16:31:38.183009 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.182996 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:31:38.183151 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.183021 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 16:31:38.183208 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.183160 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 16:31:38.183388 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.183369 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 16:31:38.184613 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.184597 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 16:31:38.184613 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.184597 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2tf2s\""
Apr 17 16:31:38.185236 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185212 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e01b849c-d59b-4646-b565-976c52d3c16b-etc-selinux\") pod \"aws-ebs-csi-driver-node-tc7rc\" (UID: \"e01b849c-d59b-4646-b565-976c52d3c16b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc"
Apr 17 16:31:38.185303 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185247 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4mth\" (UniqueName: \"kubernetes.io/projected/e01b849c-d59b-4646-b565-976c52d3c16b-kube-api-access-d4mth\") pod \"aws-ebs-csi-driver-node-tc7rc\" (UID: \"e01b849c-d59b-4646-b565-976c52d3c16b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc"
Apr 17 16:31:38.185303 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185270 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-var-lib-openvswitch\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:31:38.185303 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185295 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5lqk\" (UniqueName: \"kubernetes.io/projected/60cbc498-937e-4f93-95af-294c0a8e7beb-kube-api-access-v5lqk\") pod \"network-metrics-daemon-zsg2s\" (UID: \"60cbc498-937e-4f93-95af-294c0a8e7beb\") " pod="openshift-multus/network-metrics-daemon-zsg2s"
Apr 17 16:31:38.185452 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185361 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/db6e591a-0918-41c9-a16d-9999ecbf1df5-multus-daemon-config\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht"
Apr 17 16:31:38.185452 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185396 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-host-run-multus-certs\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht"
Apr 17 16:31:38.185452 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185423 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5htb\" (UniqueName: \"kubernetes.io/projected/db6e591a-0918-41c9-a16d-9999ecbf1df5-kube-api-access-r5htb\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht"
Apr 17 16:31:38.185582 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185466 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bfa866dd-f0dc-4c76-ac8b-1e2b8c5e7a90-iptables-alerter-script\") pod \"iptables-alerter-csqpf\" (UID: \"bfa866dd-f0dc-4c76-ac8b-1e2b8c5e7a90\") " pod="openshift-network-operator/iptables-alerter-csqpf"
Apr 17 16:31:38.185582 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185495 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eaedeb89-807e-4759-a3fe-7ccfc919f4d7-agent-certs\") pod \"konnectivity-agent-g48dw\" (UID: \"eaedeb89-807e-4759-a3fe-7ccfc919f4d7\") " pod="kube-system/konnectivity-agent-g48dw"
Apr 17 16:31:38.185582 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185515 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/883f4572-082b-45cf-809b-87efb82fbb9c-host\") pod \"node-ca-p555x\" (UID: \"883f4572-082b-45cf-809b-87efb82fbb9c\") " pod="openshift-image-registry/node-ca-p555x"
Apr 17 16:31:38.185582 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185546 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-host-kubelet\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:31:38.185582 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185568 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-systemd-units\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:31:38.185745 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185583 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4phps\" (UniqueName: \"kubernetes.io/projected/bfa866dd-f0dc-4c76-ac8b-1e2b8c5e7a90-kube-api-access-4phps\") pod \"iptables-alerter-csqpf\" (UID: \"bfa866dd-f0dc-4c76-ac8b-1e2b8c5e7a90\") " pod="openshift-network-operator/iptables-alerter-csqpf"
Apr 17 16:31:38.185745 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185602 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/eaedeb89-807e-4759-a3fe-7ccfc919f4d7-konnectivity-ca\") pod \"konnectivity-agent-g48dw\" (UID: \"eaedeb89-807e-4759-a3fe-7ccfc919f4d7\") " pod="kube-system/konnectivity-agent-g48dw"
Apr 17 16:31:38.185745 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185626 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-multus-socket-dir-parent\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht"
Apr 17 16:31:38.185745 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185645 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-run\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr"
Apr 17 16:31:38.185745 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185663 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fplh\" (UniqueName: \"kubernetes.io/projected/a97f9be6-2d21-46a6-95a1-50608634459b-kube-api-access-7fplh\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr"
Apr 17 16:31:38.185745 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185704 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-host-cni-bin\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:31:38.185745 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185721 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c756a090-293e-4944-9021-f8de796a8b45-cni-binary-copy\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf"
Apr 17 16:31:38.185745 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185745 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-var-lib-kubelet\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr"
Apr 17 16:31:38.186031 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185760 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a97f9be6-2d21-46a6-95a1-50608634459b-tmp\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr"
Apr 17 16:31:38.186031 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185794 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/883f4572-082b-45cf-809b-87efb82fbb9c-serviceca\") pod \"node-ca-p555x\" (UID: \"883f4572-082b-45cf-809b-87efb82fbb9c\") " pod="openshift-image-registry/node-ca-p555x"
Apr 17 16:31:38.186031 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185843 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-host-run-ovn-kubernetes\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:31:38.186031 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.185876 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-host-cni-netd\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:31:38.186352 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186120 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-ovnkube-script-lib\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:31:38.186352 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186148 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db6e591a-0918-41c9-a16d-9999ecbf1df5-cni-binary-copy\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht"
Apr 17 16:31:38.186352 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186171 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-etc-kubernetes\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr"
Apr 17 16:31:38.186352 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186191 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-run-systemd\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:31:38.186352 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186277 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-host-slash\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:31:38.186582 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186327 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-multus-cni-dir\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht"
Apr 17 16:31:38.186582 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186387 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-cnibin\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht"
Apr 17 16:31:38.186582 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186425 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-host-run-k8s-cni-cncf-io\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht"
Apr 17 16:31:38.186582 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186461 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-etc-modprobe-d\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr"
Apr 17 16:31:38.186582 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186485 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-etc-openvswitch\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:31:38.186582 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186507 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-log-socket\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:31:38.186582 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186543 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-host-var-lib-cni-bin\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht"
Apr 17 16:31:38.186849 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186577 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-etc-sysconfig\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr"
Apr 17 16:31:38.186849 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186614 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-ovnkube-config\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:31:38.186849 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186638 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-env-overrides\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:31:38.186849 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186655 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-etc-kubernetes\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht"
Apr 17 16:31:38.186849 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186675 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e01b849c-d59b-4646-b565-976c52d3c16b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tc7rc\" (UID: \"e01b849c-d59b-4646-b565-976c52d3c16b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc"
Apr 17 16:31:38.186849 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186690 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-sys\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr"
Apr 17 16:31:38.186849 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186704 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:31:38.186849 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186719 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bfa866dd-f0dc-4c76-ac8b-1e2b8c5e7a90-host-slash\") pod \"iptables-alerter-csqpf\" (UID: \"bfa866dd-f0dc-4c76-ac8b-1e2b8c5e7a90\") " pod="openshift-network-operator/iptables-alerter-csqpf"
Apr 17 16:31:38.186849 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186773 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-lib-modules\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr"
Apr 17 16:31:38.186849 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186814 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a97f9be6-2d21-46a6-95a1-50608634459b-etc-tuned\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr"
Apr 17 16:31:38.186849 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186842 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7dbk\" (UniqueName: \"kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk\") pod \"network-check-target-29tlc\" (UID: \"855921ad-75be-4568-9884-d3f6c5e1a862\") " pod="openshift-network-diagnostics/network-check-target-29tlc"
Apr 17 16:31:38.187227 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186863 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c756a090-293e-4944-9021-f8de796a8b45-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf"
Apr 17 16:31:38.187227 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186879 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-hostroot\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht"
Apr 17 16:31:38.187227 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186940 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-system-cni-dir\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht"
Apr 17 16:31:38.187227 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186960 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-run-openvswitch\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:31:38.187227 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.186991 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-run-ovn\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:31:38.187227 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187018 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-node-log\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:31:38.187227 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187037 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-host-var-lib-cni-multus\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht"
Apr 17 16:31:38.187227 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187052 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e01b849c-d59b-4646-b565-976c52d3c16b-registration-dir\") pod \"aws-ebs-csi-driver-node-tc7rc\" (UID: \"e01b849c-d59b-4646-b565-976c52d3c16b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc"
Apr 17 16:31:38.187227 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187088 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88sml\" (UniqueName: \"kubernetes.io/projected/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-kube-api-access-88sml\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:31:38.187227 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187104 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c756a090-293e-4944-9021-f8de796a8b45-system-cni-dir\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf"
Apr 17 16:31:38.187227 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187125 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs\") pod \"network-metrics-daemon-zsg2s\" (UID: \"60cbc498-937e-4f93-95af-294c0a8e7beb\") " pod="openshift-multus/network-metrics-daemon-zsg2s"
Apr 17 16:31:38.187227 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187146 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c756a090-293e-4944-9021-f8de796a8b45-cnibin\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf"
Apr 17 16:31:38.187227 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187175 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c756a090-293e-4944-9021-f8de796a8b45-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf"
Apr 17 16:31:38.187227 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187227 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-etc-sysctl-d\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr"
Apr 17 16:31:38.187643 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187254 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-etc-sysctl-conf\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr"
Apr 17 16:31:38.187643 ip-10-0-141-239
kubenswrapper[2548]: I0417 16:31:38.187295 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-host-var-lib-kubelet\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.187643 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187320 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-host-run-netns\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.187643 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187341 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e01b849c-d59b-4646-b565-976c52d3c16b-sys-fs\") pod \"aws-ebs-csi-driver-node-tc7rc\" (UID: \"e01b849c-d59b-4646-b565-976c52d3c16b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" Apr 17 16:31:38.187643 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187362 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e01b849c-d59b-4646-b565-976c52d3c16b-device-dir\") pod \"aws-ebs-csi-driver-node-tc7rc\" (UID: \"e01b849c-d59b-4646-b565-976c52d3c16b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" Apr 17 16:31:38.187643 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187385 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-host\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " 
pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.187643 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187409 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-ovn-node-metrics-cert\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.187643 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187432 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c756a090-293e-4944-9021-f8de796a8b45-os-release\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf" Apr 17 16:31:38.187643 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187454 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c756a090-293e-4944-9021-f8de796a8b45-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf" Apr 17 16:31:38.187643 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187477 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e01b849c-d59b-4646-b565-976c52d3c16b-socket-dir\") pod \"aws-ebs-csi-driver-node-tc7rc\" (UID: \"e01b849c-d59b-4646-b565-976c52d3c16b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" Apr 17 16:31:38.187643 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187498 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-etc-systemd\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.187643 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187520 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwgz8\" (UniqueName: \"kubernetes.io/projected/883f4572-082b-45cf-809b-87efb82fbb9c-kube-api-access-cwgz8\") pod \"node-ca-p555x\" (UID: \"883f4572-082b-45cf-809b-87efb82fbb9c\") " pod="openshift-image-registry/node-ca-p555x" Apr 17 16:31:38.187643 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187543 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-host-run-netns\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.187643 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187567 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww2wl\" (UniqueName: \"kubernetes.io/projected/c756a090-293e-4944-9021-f8de796a8b45-kube-api-access-ww2wl\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf" Apr 17 16:31:38.187643 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.187590 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-os-release\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.187643 ip-10-0-141-239 
kubenswrapper[2548]: I0417 16:31:38.187621 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-multus-conf-dir\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.195376 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.195361 2548 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 16:31:38.195504 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.195424 2548 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal" Apr 17 16:31:38.195679 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.195659 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal"] Apr 17 16:31:38.199750 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.199731 2548 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 16:31:38.210439 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.210423 2548 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 16:31:38.210552 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.210537 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal"] Apr 17 16:31:38.218160 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:38.218072 2548 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1deed1599e86e9837e6b4d3fcce1e268.slice/crio-b9320cc7b68213fb02169db64e190ed6ca07f91da4347c49c3384d542d73d693 WatchSource:0}: Error finding container b9320cc7b68213fb02169db64e190ed6ca07f91da4347c49c3384d542d73d693: Status 404 returned error can't find the container with id b9320cc7b68213fb02169db64e190ed6ca07f91da4347c49c3384d542d73d693 Apr 17 16:31:38.223592 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.223574 2548 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:31:38.226183 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.226106 2548 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-ds2dj" Apr 17 16:31:38.238839 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.238824 2548 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-ds2dj" Apr 17 16:31:38.258745 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:38.258726 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda563df7ee7058d21a512abceee773bee.slice/crio-475e9683d61aa39f89888c3faa81d0a374330015c7becf8a69c1fe5a3c701e93 WatchSource:0}: Error finding container 475e9683d61aa39f89888c3faa81d0a374330015c7becf8a69c1fe5a3c701e93: Status 404 returned error can't find the container with id 475e9683d61aa39f89888c3faa81d0a374330015c7becf8a69c1fe5a3c701e93 Apr 17 16:31:38.287939 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.287916 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-host-cni-bin\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.288030 ip-10-0-141-239 
kubenswrapper[2548]: I0417 16:31:38.287943 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c756a090-293e-4944-9021-f8de796a8b45-cni-binary-copy\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf" Apr 17 16:31:38.288030 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.287960 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-var-lib-kubelet\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.288030 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.287974 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a97f9be6-2d21-46a6-95a1-50608634459b-tmp\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.288030 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.287989 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/883f4572-082b-45cf-809b-87efb82fbb9c-serviceca\") pod \"node-ca-p555x\" (UID: \"883f4572-082b-45cf-809b-87efb82fbb9c\") " pod="openshift-image-registry/node-ca-p555x" Apr 17 16:31:38.288030 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288003 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-host-run-ovn-kubernetes\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 
16:31:38.288030 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288010 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-host-cni-bin\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.288314 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288042 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-host-cni-netd\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.288314 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288069 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-ovnkube-script-lib\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.288314 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288107 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-host-run-ovn-kubernetes\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.288314 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288113 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db6e591a-0918-41c9-a16d-9999ecbf1df5-cni-binary-copy\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 
17 16:31:38.288314 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288138 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-etc-kubernetes\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.288314 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288142 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-host-cni-netd\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.288314 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288191 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-etc-kubernetes\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.288314 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288085 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-var-lib-kubelet\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.288314 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288229 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-run-systemd\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 
16:31:38.288314 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288288 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-host-slash\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.288314 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288310 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-multus-cni-dir\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.288811 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288334 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-cnibin\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.288811 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288353 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-run-systemd\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.288811 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288359 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-host-run-k8s-cni-cncf-io\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.288811 ip-10-0-141-239 kubenswrapper[2548]: I0417 
16:31:38.288385 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-host-slash\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.288811 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288331 2548 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 16:31:38.288811 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288417 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-etc-modprobe-d\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.288811 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288443 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-etc-openvswitch\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.288811 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288391 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-host-run-k8s-cni-cncf-io\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.288811 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288470 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/883f4572-082b-45cf-809b-87efb82fbb9c-serviceca\") pod \"node-ca-p555x\" (UID: \"883f4572-082b-45cf-809b-87efb82fbb9c\") " pod="openshift-image-registry/node-ca-p555x" Apr 17 16:31:38.288811 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288485 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-log-socket\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.288811 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288502 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-multus-cni-dir\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.288811 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288508 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-host-var-lib-cni-bin\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.288811 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288446 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-cnibin\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.288811 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288529 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-etc-openvswitch\") pod 
\"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.288811 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288529 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c756a090-293e-4944-9021-f8de796a8b45-cni-binary-copy\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf" Apr 17 16:31:38.288811 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288532 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-etc-sysconfig\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.288811 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288571 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-log-socket\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.288811 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288584 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-host-var-lib-cni-bin\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.289661 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288606 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-etc-modprobe-d\") pod 
\"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.289661 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288607 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-ovnkube-config\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.289661 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288636 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-etc-sysconfig\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.289661 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288643 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-env-overrides\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.289661 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288671 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-etc-kubernetes\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.289661 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288695 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e01b849c-d59b-4646-b565-976c52d3c16b-kubelet-dir\") pod 
\"aws-ebs-csi-driver-node-tc7rc\" (UID: \"e01b849c-d59b-4646-b565-976c52d3c16b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" Apr 17 16:31:38.289661 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288698 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db6e591a-0918-41c9-a16d-9999ecbf1df5-cni-binary-copy\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.289661 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288726 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-ovnkube-script-lib\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.289661 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288722 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-sys\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.289661 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288759 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-sys\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.289661 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288760 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-etc-kubernetes\") pod \"multus-f29ht\" (UID: 
\"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.289661 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288776 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e01b849c-d59b-4646-b565-976c52d3c16b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tc7rc\" (UID: \"e01b849c-d59b-4646-b565-976c52d3c16b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" Apr 17 16:31:38.289661 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288787 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.289661 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288810 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bfa866dd-f0dc-4c76-ac8b-1e2b8c5e7a90-host-slash\") pod \"iptables-alerter-csqpf\" (UID: \"bfa866dd-f0dc-4c76-ac8b-1e2b8c5e7a90\") " pod="openshift-network-operator/iptables-alerter-csqpf" Apr 17 16:31:38.289661 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288825 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-lib-modules\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.289661 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288842 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/a97f9be6-2d21-46a6-95a1-50608634459b-etc-tuned\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.289661 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288868 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7dbk\" (UniqueName: \"kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk\") pod \"network-check-target-29tlc\" (UID: \"855921ad-75be-4568-9884-d3f6c5e1a862\") " pod="openshift-network-diagnostics/network-check-target-29tlc" Apr 17 16:31:38.290498 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288890 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bfa866dd-f0dc-4c76-ac8b-1e2b8c5e7a90-host-slash\") pod \"iptables-alerter-csqpf\" (UID: \"bfa866dd-f0dc-4c76-ac8b-1e2b8c5e7a90\") " pod="openshift-network-operator/iptables-alerter-csqpf" Apr 17 16:31:38.290498 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288916 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c756a090-293e-4944-9021-f8de796a8b45-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf" Apr 17 16:31:38.290498 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288949 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-hostroot\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.290498 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288956 2548 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-lib-modules\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.290498 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288975 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-system-cni-dir\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.290498 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.288999 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-run-openvswitch\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.290498 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289002 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.290498 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289025 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-run-ovn\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.290498 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289049 2548 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-node-log\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.290498 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289053 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-system-cni-dir\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.290498 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289082 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-host-var-lib-cni-multus\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.290498 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289090 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-node-log\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.290498 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289108 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e01b849c-d59b-4646-b565-976c52d3c16b-registration-dir\") pod \"aws-ebs-csi-driver-node-tc7rc\" (UID: \"e01b849c-d59b-4646-b565-976c52d3c16b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" Apr 17 16:31:38.290498 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289127 2548 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-run-openvswitch\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.290498 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289133 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88sml\" (UniqueName: \"kubernetes.io/projected/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-kube-api-access-88sml\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.290498 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289128 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-env-overrides\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.290498 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289156 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-host-var-lib-cni-multus\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.291389 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289176 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e01b849c-d59b-4646-b565-976c52d3c16b-registration-dir\") pod \"aws-ebs-csi-driver-node-tc7rc\" (UID: \"e01b849c-d59b-4646-b565-976c52d3c16b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" Apr 17 16:31:38.291389 ip-10-0-141-239 kubenswrapper[2548]: 
I0417 16:31:38.289197 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-hostroot\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.291389 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289184 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c756a090-293e-4944-9021-f8de796a8b45-system-cni-dir\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf" Apr 17 16:31:38.291389 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289206 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c756a090-293e-4944-9021-f8de796a8b45-system-cni-dir\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf" Apr 17 16:31:38.291389 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289240 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs\") pod \"network-metrics-daemon-zsg2s\" (UID: \"60cbc498-937e-4f93-95af-294c0a8e7beb\") " pod="openshift-multus/network-metrics-daemon-zsg2s" Apr 17 16:31:38.291389 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289290 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-run-ovn\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.291389 ip-10-0-141-239 
kubenswrapper[2548]: I0417 16:31:38.289320 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-ovnkube-config\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.291389 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289322 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c756a090-293e-4944-9021-f8de796a8b45-cnibin\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf" Apr 17 16:31:38.291389 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289356 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c756a090-293e-4944-9021-f8de796a8b45-cnibin\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf" Apr 17 16:31:38.291389 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289376 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c756a090-293e-4944-9021-f8de796a8b45-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf" Apr 17 16:31:38.291389 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289407 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-etc-sysctl-d\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " 
pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.291389 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:38.289418 2548 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:38.291389 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289431 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-etc-sysctl-conf\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.291389 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289461 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-host-var-lib-kubelet\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.291389 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289526 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-host-run-netns\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.291389 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289553 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e01b849c-d59b-4646-b565-976c52d3c16b-sys-fs\") pod \"aws-ebs-csi-driver-node-tc7rc\" (UID: \"e01b849c-d59b-4646-b565-976c52d3c16b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" Apr 17 16:31:38.291389 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289556 2548 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-etc-sysctl-d\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.291389 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289610 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e01b849c-d59b-4646-b565-976c52d3c16b-device-dir\") pod \"aws-ebs-csi-driver-node-tc7rc\" (UID: \"e01b849c-d59b-4646-b565-976c52d3c16b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" Apr 17 16:31:38.292124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289622 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-host-var-lib-kubelet\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.292124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289636 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-host\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.292124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289680 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-ovn-node-metrics-cert\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.292124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289706 2548 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c756a090-293e-4944-9021-f8de796a8b45-os-release\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf" Apr 17 16:31:38.292124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289726 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-etc-sysctl-conf\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.292124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289729 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c756a090-293e-4944-9021-f8de796a8b45-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf" Apr 17 16:31:38.292124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289747 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c756a090-293e-4944-9021-f8de796a8b45-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf" Apr 17 16:31:38.292124 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:38.289792 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs podName:60cbc498-937e-4f93-95af-294c0a8e7beb nodeName:}" failed. No retries permitted until 2026-04-17 16:31:38.789745934 +0000 UTC m=+2.028261696 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs") pod "network-metrics-daemon-zsg2s" (UID: "60cbc498-937e-4f93-95af-294c0a8e7beb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:38.292124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289843 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-host-run-netns\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.292124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289876 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c756a090-293e-4944-9021-f8de796a8b45-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf" Apr 17 16:31:38.292124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289925 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e01b849c-d59b-4646-b565-976c52d3c16b-sys-fs\") pod \"aws-ebs-csi-driver-node-tc7rc\" (UID: \"e01b849c-d59b-4646-b565-976c52d3c16b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" Apr 17 16:31:38.292124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289976 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-host\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.292124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289991 2548 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e01b849c-d59b-4646-b565-976c52d3c16b-device-dir\") pod \"aws-ebs-csi-driver-node-tc7rc\" (UID: \"e01b849c-d59b-4646-b565-976c52d3c16b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" Apr 17 16:31:38.292124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.289989 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c756a090-293e-4944-9021-f8de796a8b45-os-release\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf" Apr 17 16:31:38.292124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290037 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e01b849c-d59b-4646-b565-976c52d3c16b-socket-dir\") pod \"aws-ebs-csi-driver-node-tc7rc\" (UID: \"e01b849c-d59b-4646-b565-976c52d3c16b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" Apr 17 16:31:38.292124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290066 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-etc-systemd\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.292573 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290070 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c756a090-293e-4944-9021-f8de796a8b45-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf" Apr 17 16:31:38.292573 ip-10-0-141-239 
kubenswrapper[2548]: I0417 16:31:38.290106 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwgz8\" (UniqueName: \"kubernetes.io/projected/883f4572-082b-45cf-809b-87efb82fbb9c-kube-api-access-cwgz8\") pod \"node-ca-p555x\" (UID: \"883f4572-082b-45cf-809b-87efb82fbb9c\") " pod="openshift-image-registry/node-ca-p555x" Apr 17 16:31:38.292573 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290118 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-etc-systemd\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.292573 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290132 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-host-run-netns\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.292573 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290179 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ww2wl\" (UniqueName: \"kubernetes.io/projected/c756a090-293e-4944-9021-f8de796a8b45-kube-api-access-ww2wl\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf" Apr 17 16:31:38.292573 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290198 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e01b849c-d59b-4646-b565-976c52d3c16b-socket-dir\") pod \"aws-ebs-csi-driver-node-tc7rc\" (UID: \"e01b849c-d59b-4646-b565-976c52d3c16b\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" Apr 17 16:31:38.292573 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290209 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-os-release\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.292573 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290233 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-host-run-netns\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.292573 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290235 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-multus-conf-dir\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.292573 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290268 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-multus-conf-dir\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.292573 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290283 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e01b849c-d59b-4646-b565-976c52d3c16b-etc-selinux\") pod \"aws-ebs-csi-driver-node-tc7rc\" (UID: \"e01b849c-d59b-4646-b565-976c52d3c16b\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" Apr 17 16:31:38.292573 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290309 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4mth\" (UniqueName: \"kubernetes.io/projected/e01b849c-d59b-4646-b565-976c52d3c16b-kube-api-access-d4mth\") pod \"aws-ebs-csi-driver-node-tc7rc\" (UID: \"e01b849c-d59b-4646-b565-976c52d3c16b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" Apr 17 16:31:38.292573 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290332 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-var-lib-openvswitch\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.292573 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290348 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e01b849c-d59b-4646-b565-976c52d3c16b-etc-selinux\") pod \"aws-ebs-csi-driver-node-tc7rc\" (UID: \"e01b849c-d59b-4646-b565-976c52d3c16b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" Apr 17 16:31:38.292573 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290384 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5lqk\" (UniqueName: \"kubernetes.io/projected/60cbc498-937e-4f93-95af-294c0a8e7beb-kube-api-access-v5lqk\") pod \"network-metrics-daemon-zsg2s\" (UID: \"60cbc498-937e-4f93-95af-294c0a8e7beb\") " pod="openshift-multus/network-metrics-daemon-zsg2s" Apr 17 16:31:38.292573 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290409 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/db6e591a-0918-41c9-a16d-9999ecbf1df5-multus-daemon-config\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.292573 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290448 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-host-run-multus-certs\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.293124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290474 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5htb\" (UniqueName: \"kubernetes.io/projected/db6e591a-0918-41c9-a16d-9999ecbf1df5-kube-api-access-r5htb\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.293124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290331 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-os-release\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.293124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290520 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bfa866dd-f0dc-4c76-ac8b-1e2b8c5e7a90-iptables-alerter-script\") pod \"iptables-alerter-csqpf\" (UID: \"bfa866dd-f0dc-4c76-ac8b-1e2b8c5e7a90\") " pod="openshift-network-operator/iptables-alerter-csqpf" Apr 17 16:31:38.293124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290548 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/eaedeb89-807e-4759-a3fe-7ccfc919f4d7-agent-certs\") pod \"konnectivity-agent-g48dw\" (UID: \"eaedeb89-807e-4759-a3fe-7ccfc919f4d7\") " pod="kube-system/konnectivity-agent-g48dw" Apr 17 16:31:38.293124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290576 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-var-lib-openvswitch\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.293124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290573 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/883f4572-082b-45cf-809b-87efb82fbb9c-host\") pod \"node-ca-p555x\" (UID: \"883f4572-082b-45cf-809b-87efb82fbb9c\") " pod="openshift-image-registry/node-ca-p555x" Apr 17 16:31:38.293124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290618 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-host-kubelet\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.293124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290644 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-systemd-units\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.293124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290694 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4phps\" (UniqueName: 
\"kubernetes.io/projected/bfa866dd-f0dc-4c76-ac8b-1e2b8c5e7a90-kube-api-access-4phps\") pod \"iptables-alerter-csqpf\" (UID: \"bfa866dd-f0dc-4c76-ac8b-1e2b8c5e7a90\") " pod="openshift-network-operator/iptables-alerter-csqpf" Apr 17 16:31:38.293124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290726 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/eaedeb89-807e-4759-a3fe-7ccfc919f4d7-konnectivity-ca\") pod \"konnectivity-agent-g48dw\" (UID: \"eaedeb89-807e-4759-a3fe-7ccfc919f4d7\") " pod="kube-system/konnectivity-agent-g48dw" Apr 17 16:31:38.293124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290786 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-multus-socket-dir-parent\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.293124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290815 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-run\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.293124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290857 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fplh\" (UniqueName: \"kubernetes.io/projected/a97f9be6-2d21-46a6-95a1-50608634459b-kube-api-access-7fplh\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.293124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290916 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-host-run-multus-certs\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.293124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290976 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/db6e591a-0918-41c9-a16d-9999ecbf1df5-multus-socket-dir-parent\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.293124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.290975 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/883f4572-082b-45cf-809b-87efb82fbb9c-host\") pod \"node-ca-p555x\" (UID: \"883f4572-082b-45cf-809b-87efb82fbb9c\") " pod="openshift-image-registry/node-ca-p555x" Apr 17 16:31:38.293124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.291055 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-host-kubelet\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.293124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.291056 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-systemd-units\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.293684 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.291122 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/a97f9be6-2d21-46a6-95a1-50608634459b-run\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.293684 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.291167 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bfa866dd-f0dc-4c76-ac8b-1e2b8c5e7a90-iptables-alerter-script\") pod \"iptables-alerter-csqpf\" (UID: \"bfa866dd-f0dc-4c76-ac8b-1e2b8c5e7a90\") " pod="openshift-network-operator/iptables-alerter-csqpf" Apr 17 16:31:38.293684 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.291447 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/db6e591a-0918-41c9-a16d-9999ecbf1df5-multus-daemon-config\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.293684 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.291463 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a97f9be6-2d21-46a6-95a1-50608634459b-tmp\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.293684 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.291717 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a97f9be6-2d21-46a6-95a1-50608634459b-etc-tuned\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.293684 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.291822 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/eaedeb89-807e-4759-a3fe-7ccfc919f4d7-konnectivity-ca\") pod \"konnectivity-agent-g48dw\" (UID: \"eaedeb89-807e-4759-a3fe-7ccfc919f4d7\") " pod="kube-system/konnectivity-agent-g48dw" Apr 17 16:31:38.293684 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.292134 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-ovn-node-metrics-cert\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.293684 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.293269 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eaedeb89-807e-4759-a3fe-7ccfc919f4d7-agent-certs\") pod \"konnectivity-agent-g48dw\" (UID: \"eaedeb89-807e-4759-a3fe-7ccfc919f4d7\") " pod="kube-system/konnectivity-agent-g48dw" Apr 17 16:31:38.306369 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:38.306352 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:38.306466 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:38.306373 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:38.306466 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:38.306388 2548 projected.go:194] Error preparing data for projected volume kube-api-access-n7dbk for pod openshift-network-diagnostics/network-check-target-29tlc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:38.306466 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:38.306440 2548 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk podName:855921ad-75be-4568-9884-d3f6c5e1a862 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:38.806423976 +0000 UTC m=+2.044939735 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-n7dbk" (UniqueName: "kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk") pod "network-check-target-29tlc" (UID: "855921ad-75be-4568-9884-d3f6c5e1a862") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:38.308441 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.308422 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwgz8\" (UniqueName: \"kubernetes.io/projected/883f4572-082b-45cf-809b-87efb82fbb9c-kube-api-access-cwgz8\") pod \"node-ca-p555x\" (UID: \"883f4572-082b-45cf-809b-87efb82fbb9c\") " pod="openshift-image-registry/node-ca-p555x" Apr 17 16:31:38.308629 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.308612 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88sml\" (UniqueName: \"kubernetes.io/projected/6f8baf84-e2c1-4c17-bc5c-e068af8f6439-kube-api-access-88sml\") pod \"ovnkube-node-lq8np\" (UID: \"6f8baf84-e2c1-4c17-bc5c-e068af8f6439\") " pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.311364 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.311212 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fplh\" (UniqueName: \"kubernetes.io/projected/a97f9be6-2d21-46a6-95a1-50608634459b-kube-api-access-7fplh\") pod \"tuned-x2tmr\" (UID: \"a97f9be6-2d21-46a6-95a1-50608634459b\") " pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.311445 ip-10-0-141-239 kubenswrapper[2548]: 
I0417 16:31:38.311248 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4phps\" (UniqueName: \"kubernetes.io/projected/bfa866dd-f0dc-4c76-ac8b-1e2b8c5e7a90-kube-api-access-4phps\") pod \"iptables-alerter-csqpf\" (UID: \"bfa866dd-f0dc-4c76-ac8b-1e2b8c5e7a90\") " pod="openshift-network-operator/iptables-alerter-csqpf" Apr 17 16:31:38.311806 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.311786 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5lqk\" (UniqueName: \"kubernetes.io/projected/60cbc498-937e-4f93-95af-294c0a8e7beb-kube-api-access-v5lqk\") pod \"network-metrics-daemon-zsg2s\" (UID: \"60cbc498-937e-4f93-95af-294c0a8e7beb\") " pod="openshift-multus/network-metrics-daemon-zsg2s" Apr 17 16:31:38.312036 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.312022 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5htb\" (UniqueName: \"kubernetes.io/projected/db6e591a-0918-41c9-a16d-9999ecbf1df5-kube-api-access-r5htb\") pod \"multus-f29ht\" (UID: \"db6e591a-0918-41c9-a16d-9999ecbf1df5\") " pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.312769 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.312743 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww2wl\" (UniqueName: \"kubernetes.io/projected/c756a090-293e-4944-9021-f8de796a8b45-kube-api-access-ww2wl\") pod \"multus-additional-cni-plugins-2qscf\" (UID: \"c756a090-293e-4944-9021-f8de796a8b45\") " pod="openshift-multus/multus-additional-cni-plugins-2qscf" Apr 17 16:31:38.312836 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.312775 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4mth\" (UniqueName: \"kubernetes.io/projected/e01b849c-d59b-4646-b565-976c52d3c16b-kube-api-access-d4mth\") pod \"aws-ebs-csi-driver-node-tc7rc\" (UID: \"e01b849c-d59b-4646-b565-976c52d3c16b\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" Apr 17 16:31:38.323822 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.323788 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal" event={"ID":"a563df7ee7058d21a512abceee773bee","Type":"ContainerStarted","Data":"475e9683d61aa39f89888c3faa81d0a374330015c7becf8a69c1fe5a3c701e93"} Apr 17 16:31:38.324653 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.324632 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal" event={"ID":"1deed1599e86e9837e6b4d3fcce1e268","Type":"ContainerStarted","Data":"b9320cc7b68213fb02169db64e190ed6ca07f91da4347c49c3384d542d73d693"} Apr 17 16:31:38.407646 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.407590 2548 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:38.490741 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.490714 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-f29ht" Apr 17 16:31:38.497315 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:38.497282 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb6e591a_0918_41c9_a16d_9999ecbf1df5.slice/crio-7d4a4bfe0989ae9ca8e9fb6f5ea0ec106c177dc6a19ac58f311dfa69e07646c6 WatchSource:0}: Error finding container 7d4a4bfe0989ae9ca8e9fb6f5ea0ec106c177dc6a19ac58f311dfa69e07646c6: Status 404 returned error can't find the container with id 7d4a4bfe0989ae9ca8e9fb6f5ea0ec106c177dc6a19ac58f311dfa69e07646c6 Apr 17 16:31:38.502320 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.502299 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" Apr 17 16:31:38.508149 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:38.508127 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f8baf84_e2c1_4c17_bc5c_e068af8f6439.slice/crio-2cc054eea758ab88a998d5179c8bd9c085b815823cdbe7d6e967abb48393ced8 WatchSource:0}: Error finding container 2cc054eea758ab88a998d5179c8bd9c085b815823cdbe7d6e967abb48393ced8: Status 404 returned error can't find the container with id 2cc054eea758ab88a998d5179c8bd9c085b815823cdbe7d6e967abb48393ced8 Apr 17 16:31:38.528416 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.528397 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" Apr 17 16:31:38.532916 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.532878 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2qscf" Apr 17 16:31:38.534059 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:38.534039 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode01b849c_d59b_4646_b565_976c52d3c16b.slice/crio-b527069edb0536e8f187eb3af2b7d7d3bd55c0df74a64c7638853492710c7750 WatchSource:0}: Error finding container b527069edb0536e8f187eb3af2b7d7d3bd55c0df74a64c7638853492710c7750: Status 404 returned error can't find the container with id b527069edb0536e8f187eb3af2b7d7d3bd55c0df74a64c7638853492710c7750 Apr 17 16:31:38.538567 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:38.538546 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc756a090_293e_4944_9021_f8de796a8b45.slice/crio-507de8fbf7f74554e09811c80aef488cfc2bcc4d57a614706494b9182852b9e7 WatchSource:0}: Error finding container 
507de8fbf7f74554e09811c80aef488cfc2bcc4d57a614706494b9182852b9e7: Status 404 returned error can't find the container with id 507de8fbf7f74554e09811c80aef488cfc2bcc4d57a614706494b9182852b9e7 Apr 17 16:31:38.539153 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.539139 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-csqpf" Apr 17 16:31:38.544832 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.544815 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-g48dw" Apr 17 16:31:38.545049 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:38.545024 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfa866dd_f0dc_4c76_ac8b_1e2b8c5e7a90.slice/crio-d5a06123298540337985366090fcf0e2c098c4c05f3110deb98ef6bf29a0817a WatchSource:0}: Error finding container d5a06123298540337985366090fcf0e2c098c4c05f3110deb98ef6bf29a0817a: Status 404 returned error can't find the container with id d5a06123298540337985366090fcf0e2c098c4c05f3110deb98ef6bf29a0817a Apr 17 16:31:38.550222 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:38.550200 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaedeb89_807e_4759_a3fe_7ccfc919f4d7.slice/crio-39c899e3ab11782d19c2646b2809a03ffc8fbd9d524405050c0f1684c2649678 WatchSource:0}: Error finding container 39c899e3ab11782d19c2646b2809a03ffc8fbd9d524405050c0f1684c2649678: Status 404 returned error can't find the container with id 39c899e3ab11782d19c2646b2809a03ffc8fbd9d524405050c0f1684c2649678 Apr 17 16:31:38.550773 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.550758 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" Apr 17 16:31:38.556120 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.556106 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-p555x" Apr 17 16:31:38.556867 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:38.556774 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda97f9be6_2d21_46a6_95a1_50608634459b.slice/crio-5f1be99ec1d2a27ac80128753a9db30031374e6bb951d8bbdeaf79973f311bb9 WatchSource:0}: Error finding container 5f1be99ec1d2a27ac80128753a9db30031374e6bb951d8bbdeaf79973f311bb9: Status 404 returned error can't find the container with id 5f1be99ec1d2a27ac80128753a9db30031374e6bb951d8bbdeaf79973f311bb9 Apr 17 16:31:38.563162 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:38.563133 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod883f4572_082b_45cf_809b_87efb82fbb9c.slice/crio-036699c4785a23e1f8d42244ef83979ca4c2fbfdea2166673c4f5604c994562e WatchSource:0}: Error finding container 036699c4785a23e1f8d42244ef83979ca4c2fbfdea2166673c4f5604c994562e: Status 404 returned error can't find the container with id 036699c4785a23e1f8d42244ef83979ca4c2fbfdea2166673c4f5604c994562e Apr 17 16:31:38.794380 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.794257 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs\") pod \"network-metrics-daemon-zsg2s\" (UID: \"60cbc498-937e-4f93-95af-294c0a8e7beb\") " pod="openshift-multus/network-metrics-daemon-zsg2s" Apr 17 16:31:38.794534 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:38.794443 2548 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:38.794534 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:38.794522 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs podName:60cbc498-937e-4f93-95af-294c0a8e7beb nodeName:}" failed. No retries permitted until 2026-04-17 16:31:39.794501508 +0000 UTC m=+3.033017276 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs") pod "network-metrics-daemon-zsg2s" (UID: "60cbc498-937e-4f93-95af-294c0a8e7beb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:38.895319 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:38.895209 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7dbk\" (UniqueName: \"kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk\") pod \"network-check-target-29tlc\" (UID: \"855921ad-75be-4568-9884-d3f6c5e1a862\") " pod="openshift-network-diagnostics/network-check-target-29tlc" Apr 17 16:31:38.895501 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:38.895413 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:38.895501 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:38.895434 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:38.895501 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:38.895444 2548 projected.go:194] Error preparing data for projected volume kube-api-access-n7dbk for pod openshift-network-diagnostics/network-check-target-29tlc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:38.895501 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:38.895496 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk podName:855921ad-75be-4568-9884-d3f6c5e1a862 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:39.895481659 +0000 UTC m=+3.133997416 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-n7dbk" (UniqueName: "kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk") pod "network-check-target-29tlc" (UID: "855921ad-75be-4568-9884-d3f6c5e1a862") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:39.237463 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:39.237382 2548 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:39.240334 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:39.240298 2548 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:26:38 +0000 UTC" deadline="2027-12-22 00:26:19.087444581 +0000 UTC" Apr 17 16:31:39.240448 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:39.240338 2548 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14719h54m39.847110494s" Apr 17 16:31:39.327860 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:39.327821 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f29ht" event={"ID":"db6e591a-0918-41c9-a16d-9999ecbf1df5","Type":"ContainerStarted","Data":"7d4a4bfe0989ae9ca8e9fb6f5ea0ec106c177dc6a19ac58f311dfa69e07646c6"} Apr 17 16:31:39.335771 ip-10-0-141-239 
kubenswrapper[2548]: I0417 16:31:39.335737 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-p555x" event={"ID":"883f4572-082b-45cf-809b-87efb82fbb9c","Type":"ContainerStarted","Data":"036699c4785a23e1f8d42244ef83979ca4c2fbfdea2166673c4f5604c994562e"} Apr 17 16:31:39.344873 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:39.344799 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" event={"ID":"a97f9be6-2d21-46a6-95a1-50608634459b","Type":"ContainerStarted","Data":"5f1be99ec1d2a27ac80128753a9db30031374e6bb951d8bbdeaf79973f311bb9"} Apr 17 16:31:39.347231 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:39.347167 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-csqpf" event={"ID":"bfa866dd-f0dc-4c76-ac8b-1e2b8c5e7a90","Type":"ContainerStarted","Data":"d5a06123298540337985366090fcf0e2c098c4c05f3110deb98ef6bf29a0817a"} Apr 17 16:31:39.350455 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:39.350353 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" event={"ID":"e01b849c-d59b-4646-b565-976c52d3c16b","Type":"ContainerStarted","Data":"b527069edb0536e8f187eb3af2b7d7d3bd55c0df74a64c7638853492710c7750"} Apr 17 16:31:39.357170 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:39.357145 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" event={"ID":"6f8baf84-e2c1-4c17-bc5c-e068af8f6439","Type":"ContainerStarted","Data":"2cc054eea758ab88a998d5179c8bd9c085b815823cdbe7d6e967abb48393ced8"} Apr 17 16:31:39.358560 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:39.358540 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-g48dw" 
event={"ID":"eaedeb89-807e-4759-a3fe-7ccfc919f4d7","Type":"ContainerStarted","Data":"39c899e3ab11782d19c2646b2809a03ffc8fbd9d524405050c0f1684c2649678"} Apr 17 16:31:39.362014 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:39.361990 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2qscf" event={"ID":"c756a090-293e-4944-9021-f8de796a8b45","Type":"ContainerStarted","Data":"507de8fbf7f74554e09811c80aef488cfc2bcc4d57a614706494b9182852b9e7"} Apr 17 16:31:39.687174 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:39.687085 2548 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:39.802379 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:39.802346 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs\") pod \"network-metrics-daemon-zsg2s\" (UID: \"60cbc498-937e-4f93-95af-294c0a8e7beb\") " pod="openshift-multus/network-metrics-daemon-zsg2s" Apr 17 16:31:39.802578 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:39.802498 2548 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:39.802578 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:39.802565 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs podName:60cbc498-937e-4f93-95af-294c0a8e7beb nodeName:}" failed. No retries permitted until 2026-04-17 16:31:41.8025449 +0000 UTC m=+5.041060659 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs") pod "network-metrics-daemon-zsg2s" (UID: "60cbc498-937e-4f93-95af-294c0a8e7beb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:39.902726 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:39.902678 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7dbk\" (UniqueName: \"kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk\") pod \"network-check-target-29tlc\" (UID: \"855921ad-75be-4568-9884-d3f6c5e1a862\") " pod="openshift-network-diagnostics/network-check-target-29tlc" Apr 17 16:31:39.902936 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:39.902856 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:39.902936 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:39.902876 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:39.902936 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:39.902889 2548 projected.go:194] Error preparing data for projected volume kube-api-access-n7dbk for pod openshift-network-diagnostics/network-check-target-29tlc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:39.903108 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:39.903022 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk podName:855921ad-75be-4568-9884-d3f6c5e1a862 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:31:41.902992992 +0000 UTC m=+5.141508773 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-n7dbk" (UniqueName: "kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk") pod "network-check-target-29tlc" (UID: "855921ad-75be-4568-9884-d3f6c5e1a862") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:40.241400 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:40.241327 2548 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:26:38 +0000 UTC" deadline="2027-09-30 20:27:36.840043525 +0000 UTC" Apr 17 16:31:40.241400 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:40.241366 2548 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12747h55m56.598681074s" Apr 17 16:31:40.322235 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:40.321696 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29tlc" Apr 17 16:31:40.322235 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:40.321829 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29tlc" podUID="855921ad-75be-4568-9884-d3f6c5e1a862" Apr 17 16:31:40.322235 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:40.321865 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsg2s"
Apr 17 16:31:40.322235 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:40.322014 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsg2s" podUID="60cbc498-937e-4f93-95af-294c0a8e7beb"
Apr 17 16:31:41.816512 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:41.816464 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs\") pod \"network-metrics-daemon-zsg2s\" (UID: \"60cbc498-937e-4f93-95af-294c0a8e7beb\") " pod="openshift-multus/network-metrics-daemon-zsg2s"
Apr 17 16:31:41.816963 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:41.816623 2548 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:41.816963 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:41.816684 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs podName:60cbc498-937e-4f93-95af-294c0a8e7beb nodeName:}" failed. No retries permitted until 2026-04-17 16:31:45.816665065 +0000 UTC m=+9.055180825 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs") pod "network-metrics-daemon-zsg2s" (UID: "60cbc498-937e-4f93-95af-294c0a8e7beb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:41.917635 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:41.917587 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7dbk\" (UniqueName: \"kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk\") pod \"network-check-target-29tlc\" (UID: \"855921ad-75be-4568-9884-d3f6c5e1a862\") " pod="openshift-network-diagnostics/network-check-target-29tlc"
Apr 17 16:31:41.917828 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:41.917783 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:31:41.917828 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:41.917803 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:31:41.917920 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:41.917838 2548 projected.go:194] Error preparing data for projected volume kube-api-access-n7dbk for pod openshift-network-diagnostics/network-check-target-29tlc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:41.917955 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:41.917918 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk podName:855921ad-75be-4568-9884-d3f6c5e1a862 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:45.917876296 +0000 UTC m=+9.156392066 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-n7dbk" (UniqueName: "kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk") pod "network-check-target-29tlc" (UID: "855921ad-75be-4568-9884-d3f6c5e1a862") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:42.321227 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:42.321184 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsg2s"
Apr 17 16:31:42.321394 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:42.321185 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29tlc"
Apr 17 16:31:42.321394 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:42.321369 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsg2s" podUID="60cbc498-937e-4f93-95af-294c0a8e7beb"
Apr 17 16:31:42.321513 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:42.321466 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29tlc" podUID="855921ad-75be-4568-9884-d3f6c5e1a862"
Apr 17 16:31:43.260671 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:43.260636 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-cp2k7"]
Apr 17 16:31:43.265592 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:43.265570 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:31:43.265702 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:43.265637 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cp2k7" podUID="8405d132-1e05-4ddb-89bd-dcec490db483"
Apr 17 16:31:43.328913 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:43.328866 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8405d132-1e05-4ddb-89bd-dcec490db483-dbus\") pod \"global-pull-secret-syncer-cp2k7\" (UID: \"8405d132-1e05-4ddb-89bd-dcec490db483\") " pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:31:43.329065 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:43.328950 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8405d132-1e05-4ddb-89bd-dcec490db483-kubelet-config\") pod \"global-pull-secret-syncer-cp2k7\" (UID: \"8405d132-1e05-4ddb-89bd-dcec490db483\") " pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:31:43.329134 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:43.329067 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8405d132-1e05-4ddb-89bd-dcec490db483-original-pull-secret\") pod \"global-pull-secret-syncer-cp2k7\" (UID: \"8405d132-1e05-4ddb-89bd-dcec490db483\") " pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:31:43.430394 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:43.430347 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8405d132-1e05-4ddb-89bd-dcec490db483-kubelet-config\") pod \"global-pull-secret-syncer-cp2k7\" (UID: \"8405d132-1e05-4ddb-89bd-dcec490db483\") " pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:31:43.430557 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:43.430439 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8405d132-1e05-4ddb-89bd-dcec490db483-original-pull-secret\") pod \"global-pull-secret-syncer-cp2k7\" (UID: \"8405d132-1e05-4ddb-89bd-dcec490db483\") " pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:31:43.430557 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:43.430482 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8405d132-1e05-4ddb-89bd-dcec490db483-dbus\") pod \"global-pull-secret-syncer-cp2k7\" (UID: \"8405d132-1e05-4ddb-89bd-dcec490db483\") " pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:31:43.430663 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:43.430643 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8405d132-1e05-4ddb-89bd-dcec490db483-dbus\") pod \"global-pull-secret-syncer-cp2k7\" (UID: \"8405d132-1e05-4ddb-89bd-dcec490db483\") " pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:31:43.430736 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:43.430705 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8405d132-1e05-4ddb-89bd-dcec490db483-kubelet-config\") pod \"global-pull-secret-syncer-cp2k7\" (UID: \"8405d132-1e05-4ddb-89bd-dcec490db483\") " pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:31:43.431100 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:43.430797 2548 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:43.431100 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:43.430865 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8405d132-1e05-4ddb-89bd-dcec490db483-original-pull-secret podName:8405d132-1e05-4ddb-89bd-dcec490db483 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:43.930843565 +0000 UTC m=+7.169359329 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8405d132-1e05-4ddb-89bd-dcec490db483-original-pull-secret") pod "global-pull-secret-syncer-cp2k7" (UID: "8405d132-1e05-4ddb-89bd-dcec490db483") : object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:43.935222 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:43.935142 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8405d132-1e05-4ddb-89bd-dcec490db483-original-pull-secret\") pod \"global-pull-secret-syncer-cp2k7\" (UID: \"8405d132-1e05-4ddb-89bd-dcec490db483\") " pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:31:43.935356 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:43.935277 2548 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:43.935356 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:43.935347 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8405d132-1e05-4ddb-89bd-dcec490db483-original-pull-secret podName:8405d132-1e05-4ddb-89bd-dcec490db483 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:44.935331483 +0000 UTC m=+8.173847240 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8405d132-1e05-4ddb-89bd-dcec490db483-original-pull-secret") pod "global-pull-secret-syncer-cp2k7" (UID: "8405d132-1e05-4ddb-89bd-dcec490db483") : object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:44.322171 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:44.321887 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsg2s"
Apr 17 16:31:44.322171 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:44.321940 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29tlc"
Apr 17 16:31:44.322171 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:44.322086 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29tlc" podUID="855921ad-75be-4568-9884-d3f6c5e1a862"
Apr 17 16:31:44.322171 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:44.322033 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsg2s" podUID="60cbc498-937e-4f93-95af-294c0a8e7beb"
Apr 17 16:31:44.943289 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:44.943238 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8405d132-1e05-4ddb-89bd-dcec490db483-original-pull-secret\") pod \"global-pull-secret-syncer-cp2k7\" (UID: \"8405d132-1e05-4ddb-89bd-dcec490db483\") " pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:31:44.943455 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:44.943392 2548 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:44.943455 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:44.943449 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8405d132-1e05-4ddb-89bd-dcec490db483-original-pull-secret podName:8405d132-1e05-4ddb-89bd-dcec490db483 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:46.943435462 +0000 UTC m=+10.181951222 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8405d132-1e05-4ddb-89bd-dcec490db483-original-pull-secret") pod "global-pull-secret-syncer-cp2k7" (UID: "8405d132-1e05-4ddb-89bd-dcec490db483") : object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:45.322184 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:45.322100 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:31:45.322570 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:45.322231 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cp2k7" podUID="8405d132-1e05-4ddb-89bd-dcec490db483"
Apr 17 16:31:45.539954 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:45.539133 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-h548j"]
Apr 17 16:31:45.542632 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:45.541727 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-h548j"
Apr 17 16:31:45.546383 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:45.546359 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 16:31:45.546675 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:45.546627 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-25cmm\""
Apr 17 16:31:45.547261 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:45.547083 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 16:31:45.652013 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:45.651721 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mldp\" (UniqueName: \"kubernetes.io/projected/c4bbbe54-5b68-47bf-99a9-c5e02ce391cd-kube-api-access-2mldp\") pod \"node-resolver-h548j\" (UID: \"c4bbbe54-5b68-47bf-99a9-c5e02ce391cd\") " pod="openshift-dns/node-resolver-h548j"
Apr 17 16:31:45.652013 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:45.651821 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c4bbbe54-5b68-47bf-99a9-c5e02ce391cd-hosts-file\") pod \"node-resolver-h548j\" (UID: \"c4bbbe54-5b68-47bf-99a9-c5e02ce391cd\") " pod="openshift-dns/node-resolver-h548j"
Apr 17 16:31:45.652013 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:45.651860 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c4bbbe54-5b68-47bf-99a9-c5e02ce391cd-tmp-dir\") pod \"node-resolver-h548j\" (UID: \"c4bbbe54-5b68-47bf-99a9-c5e02ce391cd\") " pod="openshift-dns/node-resolver-h548j"
Apr 17 16:31:45.753082 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:45.753046 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mldp\" (UniqueName: \"kubernetes.io/projected/c4bbbe54-5b68-47bf-99a9-c5e02ce391cd-kube-api-access-2mldp\") pod \"node-resolver-h548j\" (UID: \"c4bbbe54-5b68-47bf-99a9-c5e02ce391cd\") " pod="openshift-dns/node-resolver-h548j"
Apr 17 16:31:45.753279 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:45.753142 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c4bbbe54-5b68-47bf-99a9-c5e02ce391cd-hosts-file\") pod \"node-resolver-h548j\" (UID: \"c4bbbe54-5b68-47bf-99a9-c5e02ce391cd\") " pod="openshift-dns/node-resolver-h548j"
Apr 17 16:31:45.753279 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:45.753175 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c4bbbe54-5b68-47bf-99a9-c5e02ce391cd-tmp-dir\") pod \"node-resolver-h548j\" (UID: \"c4bbbe54-5b68-47bf-99a9-c5e02ce391cd\") " pod="openshift-dns/node-resolver-h548j"
Apr 17 16:31:45.753545 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:45.753521 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c4bbbe54-5b68-47bf-99a9-c5e02ce391cd-tmp-dir\") pod \"node-resolver-h548j\" (UID: \"c4bbbe54-5b68-47bf-99a9-c5e02ce391cd\") " pod="openshift-dns/node-resolver-h548j"
Apr 17 16:31:45.753925 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:45.753866 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c4bbbe54-5b68-47bf-99a9-c5e02ce391cd-hosts-file\") pod \"node-resolver-h548j\" (UID: \"c4bbbe54-5b68-47bf-99a9-c5e02ce391cd\") " pod="openshift-dns/node-resolver-h548j"
Apr 17 16:31:45.765744 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:45.765719 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mldp\" (UniqueName: \"kubernetes.io/projected/c4bbbe54-5b68-47bf-99a9-c5e02ce391cd-kube-api-access-2mldp\") pod \"node-resolver-h548j\" (UID: \"c4bbbe54-5b68-47bf-99a9-c5e02ce391cd\") " pod="openshift-dns/node-resolver-h548j"
Apr 17 16:31:45.854225 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:45.854178 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs\") pod \"network-metrics-daemon-zsg2s\" (UID: \"60cbc498-937e-4f93-95af-294c0a8e7beb\") " pod="openshift-multus/network-metrics-daemon-zsg2s"
Apr 17 16:31:45.854435 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:45.854401 2548 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:45.854551 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:45.854471 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs podName:60cbc498-937e-4f93-95af-294c0a8e7beb nodeName:}" failed. No retries permitted until 2026-04-17 16:31:53.854452404 +0000 UTC m=+17.092968166 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs") pod "network-metrics-daemon-zsg2s" (UID: "60cbc498-937e-4f93-95af-294c0a8e7beb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:45.854930 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:45.854890 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-h548j"
Apr 17 16:31:45.955586 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:45.955503 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7dbk\" (UniqueName: \"kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk\") pod \"network-check-target-29tlc\" (UID: \"855921ad-75be-4568-9884-d3f6c5e1a862\") " pod="openshift-network-diagnostics/network-check-target-29tlc"
Apr 17 16:31:45.955745 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:45.955702 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:31:45.955745 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:45.955722 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:31:45.955745 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:45.955733 2548 projected.go:194] Error preparing data for projected volume kube-api-access-n7dbk for pod openshift-network-diagnostics/network-check-target-29tlc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:45.955922 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:45.955792 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk podName:855921ad-75be-4568-9884-d3f6c5e1a862 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:53.955773217 +0000 UTC m=+17.194288996 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-n7dbk" (UniqueName: "kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk") pod "network-check-target-29tlc" (UID: "855921ad-75be-4568-9884-d3f6c5e1a862") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:46.321753 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:46.321662 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29tlc"
Apr 17 16:31:46.321959 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:46.321789 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29tlc" podUID="855921ad-75be-4568-9884-d3f6c5e1a862"
Apr 17 16:31:46.322257 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:46.322236 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsg2s"
Apr 17 16:31:46.322616 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:46.322350 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsg2s" podUID="60cbc498-937e-4f93-95af-294c0a8e7beb"
Apr 17 16:31:46.963388 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:46.963345 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8405d132-1e05-4ddb-89bd-dcec490db483-original-pull-secret\") pod \"global-pull-secret-syncer-cp2k7\" (UID: \"8405d132-1e05-4ddb-89bd-dcec490db483\") " pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:31:46.963573 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:46.963458 2548 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:46.963573 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:46.963539 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8405d132-1e05-4ddb-89bd-dcec490db483-original-pull-secret podName:8405d132-1e05-4ddb-89bd-dcec490db483 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:50.963515613 +0000 UTC m=+14.202031375 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8405d132-1e05-4ddb-89bd-dcec490db483-original-pull-secret") pod "global-pull-secret-syncer-cp2k7" (UID: "8405d132-1e05-4ddb-89bd-dcec490db483") : object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:47.322388 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:47.322316 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:31:47.322800 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:47.322432 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cp2k7" podUID="8405d132-1e05-4ddb-89bd-dcec490db483"
Apr 17 16:31:48.321174 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:48.321133 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29tlc"
Apr 17 16:31:48.321347 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:48.321137 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsg2s"
Apr 17 16:31:48.321347 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:48.321283 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29tlc" podUID="855921ad-75be-4568-9884-d3f6c5e1a862"
Apr 17 16:31:48.321464 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:48.321365 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsg2s" podUID="60cbc498-937e-4f93-95af-294c0a8e7beb"
Apr 17 16:31:49.321984 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:49.321947 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:31:49.322469 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:49.322078 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cp2k7" podUID="8405d132-1e05-4ddb-89bd-dcec490db483"
Apr 17 16:31:50.321216 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:50.321174 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsg2s"
Apr 17 16:31:50.321216 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:50.321208 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29tlc"
Apr 17 16:31:50.321423 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:50.321303 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsg2s" podUID="60cbc498-937e-4f93-95af-294c0a8e7beb"
Apr 17 16:31:50.321423 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:50.321402 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29tlc" podUID="855921ad-75be-4568-9884-d3f6c5e1a862"
Apr 17 16:31:50.994347 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:50.994307 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8405d132-1e05-4ddb-89bd-dcec490db483-original-pull-secret\") pod \"global-pull-secret-syncer-cp2k7\" (UID: \"8405d132-1e05-4ddb-89bd-dcec490db483\") " pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:31:50.994763 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:50.994480 2548 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:50.994763 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:50.994554 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8405d132-1e05-4ddb-89bd-dcec490db483-original-pull-secret podName:8405d132-1e05-4ddb-89bd-dcec490db483 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:58.994534743 +0000 UTC m=+22.233050500 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8405d132-1e05-4ddb-89bd-dcec490db483-original-pull-secret") pod "global-pull-secret-syncer-cp2k7" (UID: "8405d132-1e05-4ddb-89bd-dcec490db483") : object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:51.321222 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:51.321128 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:31:51.321374 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:51.321254 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cp2k7" podUID="8405d132-1e05-4ddb-89bd-dcec490db483"
Apr 17 16:31:52.321662 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:52.321632 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29tlc"
Apr 17 16:31:52.321662 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:52.321660 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsg2s"
Apr 17 16:31:52.322195 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:52.321760 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29tlc" podUID="855921ad-75be-4568-9884-d3f6c5e1a862"
Apr 17 16:31:52.322195 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:52.321931 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsg2s" podUID="60cbc498-937e-4f93-95af-294c0a8e7beb"
Apr 17 16:31:53.321944 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:53.321909 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:31:53.322343 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:53.322031 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cp2k7" podUID="8405d132-1e05-4ddb-89bd-dcec490db483"
Apr 17 16:31:53.917806 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:53.917780 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs\") pod \"network-metrics-daemon-zsg2s\" (UID: \"60cbc498-937e-4f93-95af-294c0a8e7beb\") " pod="openshift-multus/network-metrics-daemon-zsg2s"
Apr 17 16:31:53.918043 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:53.917959 2548 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:53.918043 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:53.918034 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs podName:60cbc498-937e-4f93-95af-294c0a8e7beb nodeName:}" failed. No retries permitted until 2026-04-17 16:32:09.918011304 +0000 UTC m=+33.156527072 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs") pod "network-metrics-daemon-zsg2s" (UID: "60cbc498-937e-4f93-95af-294c0a8e7beb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:54.018741 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:54.018711 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7dbk\" (UniqueName: \"kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk\") pod \"network-check-target-29tlc\" (UID: \"855921ad-75be-4568-9884-d3f6c5e1a862\") " pod="openshift-network-diagnostics/network-check-target-29tlc"
Apr 17 16:31:54.018926 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:54.018856 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:31:54.018926 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:54.018872 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:31:54.018926 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:54.018884 2548 projected.go:194] Error preparing data for projected volume kube-api-access-n7dbk for pod openshift-network-diagnostics/network-check-target-29tlc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:54.019072 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:54.018952 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk podName:855921ad-75be-4568-9884-d3f6c5e1a862 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:10.018936267 +0000 UTC m=+33.257452027 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-n7dbk" (UniqueName: "kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk") pod "network-check-target-29tlc" (UID: "855921ad-75be-4568-9884-d3f6c5e1a862") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:54.321214 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:54.321125 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29tlc"
Apr 17 16:31:54.321214 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:54.321156 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsg2s"
Apr 17 16:31:54.321430 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:54.321255 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29tlc" podUID="855921ad-75be-4568-9884-d3f6c5e1a862"
Apr 17 16:31:54.321430 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:54.321368 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-zsg2s" podUID="60cbc498-937e-4f93-95af-294c0a8e7beb" Apr 17 16:31:55.321647 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:55.321607 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cp2k7" Apr 17 16:31:55.322070 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:55.321732 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cp2k7" podUID="8405d132-1e05-4ddb-89bd-dcec490db483" Apr 17 16:31:56.321527 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:56.321495 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsg2s" Apr 17 16:31:56.321691 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:56.321495 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29tlc" Apr 17 16:31:56.321691 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:56.321605 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zsg2s" podUID="60cbc498-937e-4f93-95af-294c0a8e7beb" Apr 17 16:31:56.321691 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:56.321657 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29tlc" podUID="855921ad-75be-4568-9884-d3f6c5e1a862" Apr 17 16:31:56.420711 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:31:56.420686 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4bbbe54_5b68_47bf_99a9_c5e02ce391cd.slice/crio-c8e4437ae44f3b2c5ee7b2f3945665bead68ab6097036dc0cc6811a58aaf4b70 WatchSource:0}: Error finding container c8e4437ae44f3b2c5ee7b2f3945665bead68ab6097036dc0cc6811a58aaf4b70: Status 404 returned error can't find the container with id c8e4437ae44f3b2c5ee7b2f3945665bead68ab6097036dc0cc6811a58aaf4b70 Apr 17 16:31:57.329719 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:57.329347 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cp2k7" Apr 17 16:31:57.330465 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:57.329879 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-cp2k7" podUID="8405d132-1e05-4ddb-89bd-dcec490db483" Apr 17 16:31:57.405975 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:57.403443 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" event={"ID":"6f8baf84-e2c1-4c17-bc5c-e068af8f6439","Type":"ContainerStarted","Data":"7e8a5a42ed21cd60445355a3afa4465888f533df5320b34e50cc0504b5e3e7c0"} Apr 17 16:31:57.405975 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:57.403488 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" event={"ID":"6f8baf84-e2c1-4c17-bc5c-e068af8f6439","Type":"ContainerStarted","Data":"f393d82e4aeea082829de2889aa7cf944db17b942da3b10959d7fef09fbe45c7"} Apr 17 16:31:57.405975 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:57.403499 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" event={"ID":"6f8baf84-e2c1-4c17-bc5c-e068af8f6439","Type":"ContainerStarted","Data":"ec6d15eb77313735cfbd578e3e20b60a650d8ea18b1f3e5ff299a1e7d8acde65"} Apr 17 16:31:57.405975 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:57.403508 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" event={"ID":"6f8baf84-e2c1-4c17-bc5c-e068af8f6439","Type":"ContainerStarted","Data":"4ff1109ed59293f97ed48cec6cecddd30eecb909c9b8cb18f5f6579e6faee65d"} Apr 17 16:31:57.405975 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:57.403516 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" event={"ID":"6f8baf84-e2c1-4c17-bc5c-e068af8f6439","Type":"ContainerStarted","Data":"f518266aadee1e2469ef9ff24be1e76814bc6285e6b68184ee61a0c225a43353"} Apr 17 16:31:57.409060 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:57.409027 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal" 
event={"ID":"1deed1599e86e9837e6b4d3fcce1e268","Type":"ContainerStarted","Data":"de7241a70842214b6bd05eda1032b346f3ef964d82f8f8d9b832f6cc2d1b5c08"} Apr 17 16:31:57.412850 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:57.412826 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f29ht" event={"ID":"db6e591a-0918-41c9-a16d-9999ecbf1df5","Type":"ContainerStarted","Data":"6d9ce218d16cc3fe6b422ff4df8adfe4617e79417882acf5549fee3c50a61ee8"} Apr 17 16:31:57.414933 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:57.414885 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h548j" event={"ID":"c4bbbe54-5b68-47bf-99a9-c5e02ce391cd","Type":"ContainerStarted","Data":"c8e4437ae44f3b2c5ee7b2f3945665bead68ab6097036dc0cc6811a58aaf4b70"} Apr 17 16:31:57.416886 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:57.416517 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" event={"ID":"a97f9be6-2d21-46a6-95a1-50608634459b","Type":"ContainerStarted","Data":"f5ea8d813ac447e2b4672103f83b8bca5f3eaca122a7ff60bb8b709004881f25"} Apr 17 16:31:57.423300 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:57.423257 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal" podStartSLOduration=19.423241519 podStartE2EDuration="19.423241519s" podCreationTimestamp="2026-04-17 16:31:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:57.422408665 +0000 UTC m=+20.660924444" watchObservedRunningTime="2026-04-17 16:31:57.423241519 +0000 UTC m=+20.661757297" Apr 17 16:31:57.437670 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:57.437634 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-x2tmr" 
podStartSLOduration=2.550482077 podStartE2EDuration="20.437623592s" podCreationTimestamp="2026-04-17 16:31:37 +0000 UTC" firstStartedPulling="2026-04-17 16:31:38.558172451 +0000 UTC m=+1.796688208" lastFinishedPulling="2026-04-17 16:31:56.445313962 +0000 UTC m=+19.683829723" observedRunningTime="2026-04-17 16:31:57.437503394 +0000 UTC m=+20.676019367" watchObservedRunningTime="2026-04-17 16:31:57.437623592 +0000 UTC m=+20.676139371" Apr 17 16:31:57.454156 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:57.454118 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-f29ht" podStartSLOduration=2.396662385 podStartE2EDuration="20.454106838s" podCreationTimestamp="2026-04-17 16:31:37 +0000 UTC" firstStartedPulling="2026-04-17 16:31:38.498871234 +0000 UTC m=+1.737386990" lastFinishedPulling="2026-04-17 16:31:56.556315674 +0000 UTC m=+19.794831443" observedRunningTime="2026-04-17 16:31:57.453799012 +0000 UTC m=+20.692314792" watchObservedRunningTime="2026-04-17 16:31:57.454106838 +0000 UTC m=+20.692622616" Apr 17 16:31:58.321927 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:58.321818 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29tlc" Apr 17 16:31:58.321927 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:58.321826 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsg2s" Apr 17 16:31:58.322139 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:58.321967 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-29tlc" podUID="855921ad-75be-4568-9884-d3f6c5e1a862" Apr 17 16:31:58.322139 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:58.322036 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsg2s" podUID="60cbc498-937e-4f93-95af-294c0a8e7beb" Apr 17 16:31:58.420181 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:58.419984 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h548j" event={"ID":"c4bbbe54-5b68-47bf-99a9-c5e02ce391cd","Type":"ContainerStarted","Data":"e546dcd86d72ae97ece7b904338d93a7bbaae21451e4607a3e247a2ddd06ac95"} Apr 17 16:31:58.422124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:58.421696 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-p555x" event={"ID":"883f4572-082b-45cf-809b-87efb82fbb9c","Type":"ContainerStarted","Data":"986575bbac5cb27281da30202e443315ae4a9c4c62bb0fcd74c433108b8c7578"} Apr 17 16:31:58.423551 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:58.423524 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-csqpf" event={"ID":"bfa866dd-f0dc-4c76-ac8b-1e2b8c5e7a90","Type":"ContainerStarted","Data":"3508c6cfcb9e9f83ed1772ef350d0aa389ad546f2d923bee464d845bc93b599c"} Apr 17 16:31:58.424749 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:58.424716 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" event={"ID":"e01b849c-d59b-4646-b565-976c52d3c16b","Type":"ContainerStarted","Data":"bba24bdf323b9f156d106dcf73a9592608861651b2a7ea73cd83f4988df0f010"} Apr 17 16:31:58.427526 ip-10-0-141-239 
kubenswrapper[2548]: I0417 16:31:58.427493 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" event={"ID":"6f8baf84-e2c1-4c17-bc5c-e068af8f6439","Type":"ContainerStarted","Data":"414e3fece6044c2f59caf9987956db4537b09c6e88c5ddb8676077dcf102ce85"} Apr 17 16:31:58.428815 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:58.428793 2548 generic.go:358] "Generic (PLEG): container finished" podID="a563df7ee7058d21a512abceee773bee" containerID="f4340e0115ff0ee1e8cf071ac2bf23de97bfae5aae4f38cc6f40b3c11b07fc6f" exitCode=0 Apr 17 16:31:58.428919 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:58.428868 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal" event={"ID":"a563df7ee7058d21a512abceee773bee","Type":"ContainerDied","Data":"f4340e0115ff0ee1e8cf071ac2bf23de97bfae5aae4f38cc6f40b3c11b07fc6f"} Apr 17 16:31:58.430282 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:58.430262 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-g48dw" event={"ID":"eaedeb89-807e-4759-a3fe-7ccfc919f4d7","Type":"ContainerStarted","Data":"813fbb7af15016f8aa3a3156a5df55bfc5d0f994a97cf733d37c3a75e39e6ef4"} Apr 17 16:31:58.431558 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:58.431536 2548 generic.go:358] "Generic (PLEG): container finished" podID="c756a090-293e-4944-9021-f8de796a8b45" containerID="f79d8ea0fd347dc6945c832b68327c53b837f935d625744fe05a2099701ddfa2" exitCode=0 Apr 17 16:31:58.431660 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:58.431646 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2qscf" event={"ID":"c756a090-293e-4944-9021-f8de796a8b45","Type":"ContainerDied","Data":"f79d8ea0fd347dc6945c832b68327c53b837f935d625744fe05a2099701ddfa2"} Apr 17 16:31:58.435100 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:58.435062 2548 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-h548j" podStartSLOduration=13.435050983 podStartE2EDuration="13.435050983s" podCreationTimestamp="2026-04-17 16:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:58.43495745 +0000 UTC m=+21.673473220" watchObservedRunningTime="2026-04-17 16:31:58.435050983 +0000 UTC m=+21.673566762" Apr 17 16:31:58.477223 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:58.477161 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-g48dw" podStartSLOduration=3.579783714 podStartE2EDuration="21.477144087s" podCreationTimestamp="2026-04-17 16:31:37 +0000 UTC" firstStartedPulling="2026-04-17 16:31:38.551588249 +0000 UTC m=+1.790104010" lastFinishedPulling="2026-04-17 16:31:56.448948626 +0000 UTC m=+19.687464383" observedRunningTime="2026-04-17 16:31:58.452415227 +0000 UTC m=+21.690931008" watchObservedRunningTime="2026-04-17 16:31:58.477144087 +0000 UTC m=+21.715659867" Apr 17 16:31:58.508375 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:58.508335 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-csqpf" podStartSLOduration=3.667059395 podStartE2EDuration="21.50832344s" podCreationTimestamp="2026-04-17 16:31:37 +0000 UTC" firstStartedPulling="2026-04-17 16:31:38.547274786 +0000 UTC m=+1.785790547" lastFinishedPulling="2026-04-17 16:31:56.388538824 +0000 UTC m=+19.627054592" observedRunningTime="2026-04-17 16:31:58.492147559 +0000 UTC m=+21.730663350" watchObservedRunningTime="2026-04-17 16:31:58.50832344 +0000 UTC m=+21.746839213" Apr 17 16:31:58.508702 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:58.508669 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-p555x" 
podStartSLOduration=3.6868522390000003 podStartE2EDuration="21.508660072s" podCreationTimestamp="2026-04-17 16:31:37 +0000 UTC" firstStartedPulling="2026-04-17 16:31:38.566732048 +0000 UTC m=+1.805247808" lastFinishedPulling="2026-04-17 16:31:56.38853987 +0000 UTC m=+19.627055641" observedRunningTime="2026-04-17 16:31:58.507854685 +0000 UTC m=+21.746370466" watchObservedRunningTime="2026-04-17 16:31:58.508660072 +0000 UTC m=+21.747175851" Apr 17 16:31:58.539489 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:58.539470 2548 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 16:31:59.056072 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:59.056033 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8405d132-1e05-4ddb-89bd-dcec490db483-original-pull-secret\") pod \"global-pull-secret-syncer-cp2k7\" (UID: \"8405d132-1e05-4ddb-89bd-dcec490db483\") " pod="kube-system/global-pull-secret-syncer-cp2k7" Apr 17 16:31:59.056388 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:59.056182 2548 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:59.056388 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:59.056238 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8405d132-1e05-4ddb-89bd-dcec490db483-original-pull-secret podName:8405d132-1e05-4ddb-89bd-dcec490db483 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:15.056221515 +0000 UTC m=+38.294737271 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8405d132-1e05-4ddb-89bd-dcec490db483-original-pull-secret") pod "global-pull-secret-syncer-cp2k7" (UID: "8405d132-1e05-4ddb-89bd-dcec490db483") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:59.267318 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:59.267114 2548 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T16:31:58.539485812Z","UUID":"7f0e41c2-6e1f-44b8-b9d4-4e239bf410ce","Handler":null,"Name":"","Endpoint":""} Apr 17 16:31:59.269100 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:59.269078 2548 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 16:31:59.269100 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:59.269107 2548 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 16:31:59.324874 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:59.324806 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cp2k7" Apr 17 16:31:59.325037 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:31:59.324930 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-cp2k7" podUID="8405d132-1e05-4ddb-89bd-dcec490db483" Apr 17 16:31:59.435917 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:59.435859 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" event={"ID":"e01b849c-d59b-4646-b565-976c52d3c16b","Type":"ContainerStarted","Data":"5df8e206468dabebec00f1d68c3621c369e47985b08d9823f54053aadc8d3c70"} Apr 17 16:31:59.437590 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:59.437563 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal" event={"ID":"a563df7ee7058d21a512abceee773bee","Type":"ContainerStarted","Data":"b17846abbc1a0a899eabd8aa9fce9d55fda102222b79023c1acd9cdf53b9c285"} Apr 17 16:31:59.468960 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:59.468933 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-g48dw" Apr 17 16:31:59.469587 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:59.469567 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-g48dw" Apr 17 16:31:59.483681 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:31:59.483645 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal" podStartSLOduration=21.483631148 podStartE2EDuration="21.483631148s" podCreationTimestamp="2026-04-17 16:31:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:59.452425409 +0000 UTC m=+22.690941192" watchObservedRunningTime="2026-04-17 16:31:59.483631148 +0000 UTC m=+22.722146929" Apr 17 16:32:00.321483 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:00.321452 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsg2s" Apr 17 16:32:00.321661 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:00.321453 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29tlc" Apr 17 16:32:00.321661 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:00.321585 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsg2s" podUID="60cbc498-937e-4f93-95af-294c0a8e7beb" Apr 17 16:32:00.321661 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:00.321638 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-29tlc" podUID="855921ad-75be-4568-9884-d3f6c5e1a862" Apr 17 16:32:00.441008 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:00.440965 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" event={"ID":"e01b849c-d59b-4646-b565-976c52d3c16b","Type":"ContainerStarted","Data":"89096c1a78b39cb793a5204ee9ca33d8bf56dfed3e624255b796e07da2fcd909"} Apr 17 16:32:00.444200 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:00.444163 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" event={"ID":"6f8baf84-e2c1-4c17-bc5c-e068af8f6439","Type":"ContainerStarted","Data":"32e70714f53fe6c898a031811212421d49b0e6108c4b6942a7f87af374af5388"} Apr 17 16:32:00.457501 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:00.457460 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tc7rc" podStartSLOduration=2.592992338 podStartE2EDuration="23.457447394s" podCreationTimestamp="2026-04-17 16:31:37 +0000 UTC" firstStartedPulling="2026-04-17 16:31:38.535884429 +0000 UTC m=+1.774400189" lastFinishedPulling="2026-04-17 16:31:59.400339473 +0000 UTC m=+22.638855245" observedRunningTime="2026-04-17 16:32:00.456955777 +0000 UTC m=+23.695471556" watchObservedRunningTime="2026-04-17 16:32:00.457447394 +0000 UTC m=+23.695963172" Apr 17 16:32:01.322025 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:01.321985 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-cp2k7" Apr 17 16:32:01.322192 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:01.322128 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cp2k7" podUID="8405d132-1e05-4ddb-89bd-dcec490db483" Apr 17 16:32:01.445937 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:01.445890 2548 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 16:32:02.321446 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:02.321325 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsg2s" Apr 17 16:32:02.321543 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:02.321349 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29tlc" Apr 17 16:32:02.321598 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:02.321568 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsg2s" podUID="60cbc498-937e-4f93-95af-294c0a8e7beb" Apr 17 16:32:02.321689 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:02.321671 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29tlc" podUID="855921ad-75be-4568-9884-d3f6c5e1a862"
Apr 17 16:32:02.451216 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:02.451185 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" event={"ID":"6f8baf84-e2c1-4c17-bc5c-e068af8f6439","Type":"ContainerStarted","Data":"778bd08a5d8da4e76461506f77e6cf688765e8c346d2fef94ed3fbd4bf025251"}
Apr 17 16:32:02.451557 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:02.451509 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:32:02.451557 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:02.451532 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:32:02.451666 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:02.451602 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:32:02.470250 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:02.470226 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:32:02.470690 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:02.470660 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:32:02.476954 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:02.476910 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lq8np" podStartSLOduration=6.983086005 podStartE2EDuration="25.476881519s" podCreationTimestamp="2026-04-17 16:31:37 +0000 UTC" firstStartedPulling="2026-04-17 16:31:38.509647502 +0000 UTC m=+1.748163259" lastFinishedPulling="2026-04-17 16:31:57.003443016 +0000 UTC m=+20.241958773" observedRunningTime="2026-04-17 16:32:02.475677424 +0000 UTC m=+25.714193202" watchObservedRunningTime="2026-04-17 16:32:02.476881519 +0000 UTC m=+25.715397298"
Apr 17 16:32:02.518941 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:02.518914 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-g48dw"
Apr 17 16:32:02.519045 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:02.519035 2548 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 17 16:32:02.519634 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:02.519611 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-g48dw"
Apr 17 16:32:03.324072 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:03.324046 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:32:03.324254 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:03.324142 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cp2k7" podUID="8405d132-1e05-4ddb-89bd-dcec490db483"
Apr 17 16:32:03.437805 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:03.437778 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-h548j_c4bbbe54-5b68-47bf-99a9-c5e02ce391cd/dns-node-resolver/0.log"
Apr 17 16:32:03.454460 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:03.454423 2548 generic.go:358] "Generic (PLEG): container finished" podID="c756a090-293e-4944-9021-f8de796a8b45" containerID="9c954485fb7a923363778cbc44771efe375ca61817b4635e9cb4322699a7fc94" exitCode=0
Apr 17 16:32:03.454940 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:03.454481 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2qscf" event={"ID":"c756a090-293e-4944-9021-f8de796a8b45","Type":"ContainerDied","Data":"9c954485fb7a923363778cbc44771efe375ca61817b4635e9cb4322699a7fc94"}
Apr 17 16:32:04.321945 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:04.321732 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29tlc"
Apr 17 16:32:04.322059 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:04.321742 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsg2s"
Apr 17 16:32:04.322059 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:04.321990 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29tlc" podUID="855921ad-75be-4568-9884-d3f6c5e1a862"
Apr 17 16:32:04.322143 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:04.322073 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsg2s" podUID="60cbc498-937e-4f93-95af-294c0a8e7beb"
Apr 17 16:32:04.342042 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:04.342014 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cp2k7"]
Apr 17 16:32:04.342185 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:04.342142 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:32:04.342264 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:04.342244 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cp2k7" podUID="8405d132-1e05-4ddb-89bd-dcec490db483"
Apr 17 16:32:04.342736 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:04.342713 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zsg2s"]
Apr 17 16:32:04.343407 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:04.343387 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-29tlc"]
Apr 17 16:32:04.458337 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:04.458309 2548 generic.go:358] "Generic (PLEG): container finished" podID="c756a090-293e-4944-9021-f8de796a8b45" containerID="4f49b525b48f6e0d9ace32bf8ca8c00230aec8bb1c7ff19e92d31af2b5569e56" exitCode=0
Apr 17 16:32:04.458988 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:04.458408 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29tlc"
Apr 17 16:32:04.458988 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:04.458404 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2qscf" event={"ID":"c756a090-293e-4944-9021-f8de796a8b45","Type":"ContainerDied","Data":"4f49b525b48f6e0d9ace32bf8ca8c00230aec8bb1c7ff19e92d31af2b5569e56"}
Apr 17 16:32:04.458988 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:04.458417 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsg2s"
Apr 17 16:32:04.458988 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:04.458574 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsg2s" podUID="60cbc498-937e-4f93-95af-294c0a8e7beb"
Apr 17 16:32:04.458988 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:04.458626 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29tlc" podUID="855921ad-75be-4568-9884-d3f6c5e1a862"
Apr 17 16:32:04.624021 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:04.623954 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-p555x_883f4572-082b-45cf-809b-87efb82fbb9c/node-ca/0.log"
Apr 17 16:32:06.321594 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:06.321569 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsg2s"
Apr 17 16:32:06.322156 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:06.321571 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29tlc"
Apr 17 16:32:06.322156 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:06.321690 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsg2s" podUID="60cbc498-937e-4f93-95af-294c0a8e7beb"
Apr 17 16:32:06.322156 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:06.321571 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:32:06.322156 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:06.321737 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29tlc" podUID="855921ad-75be-4568-9884-d3f6c5e1a862"
Apr 17 16:32:06.322156 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:06.321812 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cp2k7" podUID="8405d132-1e05-4ddb-89bd-dcec490db483"
Apr 17 16:32:08.321933 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:08.321884 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29tlc"
Apr 17 16:32:08.321933 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:08.321926 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:32:08.322435 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:08.321969 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsg2s"
Apr 17 16:32:08.322435 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:08.322050 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsg2s" podUID="60cbc498-937e-4f93-95af-294c0a8e7beb"
Apr 17 16:32:08.322435 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:08.322166 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cp2k7" podUID="8405d132-1e05-4ddb-89bd-dcec490db483"
Apr 17 16:32:08.322435 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:08.322233 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-29tlc" podUID="855921ad-75be-4568-9884-d3f6c5e1a862"
Apr 17 16:32:09.941631 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:09.941547 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs\") pod \"network-metrics-daemon-zsg2s\" (UID: \"60cbc498-937e-4f93-95af-294c0a8e7beb\") " pod="openshift-multus/network-metrics-daemon-zsg2s"
Apr 17 16:32:09.942071 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:09.941705 2548 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:32:09.942071 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:09.941783 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs podName:60cbc498-937e-4f93-95af-294c0a8e7beb nodeName:}" failed. No retries permitted until 2026-04-17 16:32:41.941761108 +0000 UTC m=+65.180276866 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs") pod "network-metrics-daemon-zsg2s" (UID: "60cbc498-937e-4f93-95af-294c0a8e7beb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:32:10.042190 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.042161 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7dbk\" (UniqueName: \"kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk\") pod \"network-check-target-29tlc\" (UID: \"855921ad-75be-4568-9884-d3f6c5e1a862\") " pod="openshift-network-diagnostics/network-check-target-29tlc"
Apr 17 16:32:10.042329 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.042281 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:32:10.042329 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.042295 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:32:10.042329 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.042303 2548 projected.go:194] Error preparing data for projected volume kube-api-access-n7dbk for pod openshift-network-diagnostics/network-check-target-29tlc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:32:10.042442 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.042346 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk podName:855921ad-75be-4568-9884-d3f6c5e1a862 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:42.04233377 +0000 UTC m=+65.280849526 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-n7dbk" (UniqueName: "kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk") pod "network-check-target-29tlc" (UID: "855921ad-75be-4568-9884-d3f6c5e1a862") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:32:10.067360 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.067339 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeReady"
Apr 17 16:32:10.067470 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.067438 2548 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 16:32:10.097924 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.097882 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6986bdc748-x8wgt"]
Apr 17 16:32:10.126414 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.126390 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-96878"]
Apr 17 16:32:10.126566 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.126549 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6986bdc748-x8wgt"
Apr 17 16:32:10.130766 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.129784 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 16:32:10.130766 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.130165 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 16:32:10.130766 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.130639 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-psh7p\""
Apr 17 16:32:10.131362 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.131315 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 16:32:10.135727 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.135703 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 16:32:10.148139 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.148117 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-t44zr"]
Apr 17 16:32:10.148306 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.148287 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-96878"
Apr 17 16:32:10.150760 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.150739 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cnl7t\""
Apr 17 16:32:10.150994 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.150939 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 16:32:10.150994 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.150970 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 16:32:10.151115 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.150978 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 16:32:10.167603 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.167581 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6986bdc748-x8wgt"]
Apr 17 16:32:10.167603 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.167602 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-96878"]
Apr 17 16:32:10.167751 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.167611 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6w5xn"]
Apr 17 16:32:10.167751 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.167745 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-t44zr"
Apr 17 16:32:10.169981 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.169963 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 16:32:10.170078 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.170001 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vwmhs\""
Apr 17 16:32:10.170078 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.170034 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 16:32:10.170078 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.170074 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 16:32:10.170325 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.170267 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 16:32:10.188662 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.188643 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-t44zr"]
Apr 17 16:32:10.188662 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.188664 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6w5xn"]
Apr 17 16:32:10.188791 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.188750 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6w5xn"
Apr 17 16:32:10.190849 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.190831 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 16:32:10.190849 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.190845 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 16:32:10.190990 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.190832 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xz9tf\""
Apr 17 16:32:10.244029 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.244004 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/58c1791e-3029-4f4d-be45-e94ca7e72a6e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-t44zr\" (UID: \"58c1791e-3029-4f4d-be45-e94ca7e72a6e\") " pod="openshift-insights/insights-runtime-extractor-t44zr"
Apr 17 16:32:10.244123 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.244034 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrnnt\" (UniqueName: \"kubernetes.io/projected/58c1791e-3029-4f4d-be45-e94ca7e72a6e-kube-api-access-vrnnt\") pod \"insights-runtime-extractor-t44zr\" (UID: \"58c1791e-3029-4f4d-be45-e94ca7e72a6e\") " pod="openshift-insights/insights-runtime-extractor-t44zr"
Apr 17 16:32:10.244123 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.244054 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e5a8842-0933-4179-9241-eb13bf048769-cert\") pod \"ingress-canary-96878\" (UID: \"2e5a8842-0933-4179-9241-eb13bf048769\") " pod="openshift-ingress-canary/ingress-canary-96878"
Apr 17 16:32:10.244123 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.244082 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/330bfc85-4775-4a2f-91d9-0120c5940a66-ca-trust-extracted\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt"
Apr 17 16:32:10.244123 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.244111 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/330bfc85-4775-4a2f-91d9-0120c5940a66-image-registry-private-configuration\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt"
Apr 17 16:32:10.244249 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.244129 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-bound-sa-token\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt"
Apr 17 16:32:10.244249 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.244150 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/330bfc85-4775-4a2f-91d9-0120c5940a66-registry-certificates\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt"
Apr 17 16:32:10.244249 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.244203 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/330bfc85-4775-4a2f-91d9-0120c5940a66-trusted-ca\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt"
Apr 17 16:32:10.244249 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.244244 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzcxt\" (UniqueName: \"kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-kube-api-access-bzcxt\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt"
Apr 17 16:32:10.244359 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.244278 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/58c1791e-3029-4f4d-be45-e94ca7e72a6e-data-volume\") pod \"insights-runtime-extractor-t44zr\" (UID: \"58c1791e-3029-4f4d-be45-e94ca7e72a6e\") " pod="openshift-insights/insights-runtime-extractor-t44zr"
Apr 17 16:32:10.244359 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.244312 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/330bfc85-4775-4a2f-91d9-0120c5940a66-installation-pull-secrets\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt"
Apr 17 16:32:10.244359 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.244335 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9crd\" (UniqueName: \"kubernetes.io/projected/2e5a8842-0933-4179-9241-eb13bf048769-kube-api-access-z9crd\") pod \"ingress-canary-96878\" (UID: \"2e5a8842-0933-4179-9241-eb13bf048769\") " pod="openshift-ingress-canary/ingress-canary-96878"
Apr 17 16:32:10.244443 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.244371 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-registry-tls\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt"
Apr 17 16:32:10.244443 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.244388 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/58c1791e-3029-4f4d-be45-e94ca7e72a6e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-t44zr\" (UID: \"58c1791e-3029-4f4d-be45-e94ca7e72a6e\") " pod="openshift-insights/insights-runtime-extractor-t44zr"
Apr 17 16:32:10.244443 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.244407 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/58c1791e-3029-4f4d-be45-e94ca7e72a6e-crio-socket\") pod \"insights-runtime-extractor-t44zr\" (UID: \"58c1791e-3029-4f4d-be45-e94ca7e72a6e\") " pod="openshift-insights/insights-runtime-extractor-t44zr"
Apr 17 16:32:10.321505 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.321483 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsg2s"
Apr 17 16:32:10.321593 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.321485 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29tlc"
Apr 17 16:32:10.321715 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.321486 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:32:10.323711 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.323691 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 16:32:10.323816 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.323699 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6m66z\""
Apr 17 16:32:10.323816 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.323699 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 16:32:10.323963 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.323819 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fwf7s\""
Apr 17 16:32:10.324027 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.324003 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 16:32:10.324089 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.324028 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 16:32:10.345313 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.345295 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/330bfc85-4775-4a2f-91d9-0120c5940a66-registry-certificates\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt"
Apr 17 16:32:10.345399 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.345323 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/330bfc85-4775-4a2f-91d9-0120c5940a66-trusted-ca\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt"
Apr 17 16:32:10.345399 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.345344 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzcxt\" (UniqueName: \"kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-kube-api-access-bzcxt\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt"
Apr 17 16:32:10.345399 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.345359 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/58c1791e-3029-4f4d-be45-e94ca7e72a6e-data-volume\") pod \"insights-runtime-extractor-t44zr\" (UID: \"58c1791e-3029-4f4d-be45-e94ca7e72a6e\") " pod="openshift-insights/insights-runtime-extractor-t44zr"
Apr 17 16:32:10.345554 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.345538 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpd9q\" (UniqueName: \"kubernetes.io/projected/d91bd4ff-4efd-449d-b375-d8843508d28c-kube-api-access-qpd9q\") pod \"dns-default-6w5xn\" (UID: \"d91bd4ff-4efd-449d-b375-d8843508d28c\") " pod="openshift-dns/dns-default-6w5xn"
Apr 17 16:32:10.345610 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.345587 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/330bfc85-4775-4a2f-91d9-0120c5940a66-installation-pull-secrets\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt"
Apr 17 16:32:10.345610 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.345603 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/58c1791e-3029-4f4d-be45-e94ca7e72a6e-data-volume\") pod \"insights-runtime-extractor-t44zr\" (UID: \"58c1791e-3029-4f4d-be45-e94ca7e72a6e\") " pod="openshift-insights/insights-runtime-extractor-t44zr"
Apr 17 16:32:10.345712 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.345619 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9crd\" (UniqueName: \"kubernetes.io/projected/2e5a8842-0933-4179-9241-eb13bf048769-kube-api-access-z9crd\") pod \"ingress-canary-96878\" (UID: \"2e5a8842-0933-4179-9241-eb13bf048769\") " pod="openshift-ingress-canary/ingress-canary-96878"
Apr 17 16:32:10.345712 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.345647 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d91bd4ff-4efd-449d-b375-d8843508d28c-config-volume\") pod \"dns-default-6w5xn\" (UID: \"d91bd4ff-4efd-449d-b375-d8843508d28c\") " pod="openshift-dns/dns-default-6w5xn"
Apr 17 16:32:10.345712 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.345672 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d91bd4ff-4efd-449d-b375-d8843508d28c-tmp-dir\") pod \"dns-default-6w5xn\" (UID: \"d91bd4ff-4efd-449d-b375-d8843508d28c\") " pod="openshift-dns/dns-default-6w5xn"
Apr 17 16:32:10.345878 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.345728 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-registry-tls\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt"
Apr 17 16:32:10.345878 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.345758 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/58c1791e-3029-4f4d-be45-e94ca7e72a6e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-t44zr\" (UID: \"58c1791e-3029-4f4d-be45-e94ca7e72a6e\") " pod="openshift-insights/insights-runtime-extractor-t44zr"
Apr 17 16:32:10.345878 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.345789 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/58c1791e-3029-4f4d-be45-e94ca7e72a6e-crio-socket\") pod \"insights-runtime-extractor-t44zr\" (UID: \"58c1791e-3029-4f4d-be45-e94ca7e72a6e\") " pod="openshift-insights/insights-runtime-extractor-t44zr"
Apr 17 16:32:10.345878 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.345814 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d91bd4ff-4efd-449d-b375-d8843508d28c-metrics-tls\") pod \"dns-default-6w5xn\" (UID: \"d91bd4ff-4efd-449d-b375-d8843508d28c\") " pod="openshift-dns/dns-default-6w5xn"
Apr 17 16:32:10.345878 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.345865 2548 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 16:32:10.346125 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.345884 2548 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6986bdc748-x8wgt: secret "image-registry-tls" not found
Apr 17 16:32:10.346125 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.345965 2548 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 17 16:32:10.346125 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.345988 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/58c1791e-3029-4f4d-be45-e94ca7e72a6e-crio-socket\") pod \"insights-runtime-extractor-t44zr\" (UID: \"58c1791e-3029-4f4d-be45-e94ca7e72a6e\") " pod="openshift-insights/insights-runtime-extractor-t44zr"
Apr 17 16:32:10.346125 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.346001 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/330bfc85-4775-4a2f-91d9-0120c5940a66-registry-certificates\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt"
Apr 17 16:32:10.346125 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.345970 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-registry-tls podName:330bfc85-4775-4a2f-91d9-0120c5940a66 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:10.845951418 +0000 UTC m=+34.084467192 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-registry-tls") pod "image-registry-6986bdc748-x8wgt" (UID: "330bfc85-4775-4a2f-91d9-0120c5940a66") : secret "image-registry-tls" not found Apr 17 16:32:10.346125 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.345865 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/58c1791e-3029-4f4d-be45-e94ca7e72a6e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-t44zr\" (UID: \"58c1791e-3029-4f4d-be45-e94ca7e72a6e\") " pod="openshift-insights/insights-runtime-extractor-t44zr" Apr 17 16:32:10.346125 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.346048 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58c1791e-3029-4f4d-be45-e94ca7e72a6e-insights-runtime-extractor-tls podName:58c1791e-3029-4f4d-be45-e94ca7e72a6e nodeName:}" failed. No retries permitted until 2026-04-17 16:32:10.846029292 +0000 UTC m=+34.084545049 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/58c1791e-3029-4f4d-be45-e94ca7e72a6e-insights-runtime-extractor-tls") pod "insights-runtime-extractor-t44zr" (UID: "58c1791e-3029-4f4d-be45-e94ca7e72a6e") : secret "insights-runtime-extractor-tls" not found Apr 17 16:32:10.346125 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.346082 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrnnt\" (UniqueName: \"kubernetes.io/projected/58c1791e-3029-4f4d-be45-e94ca7e72a6e-kube-api-access-vrnnt\") pod \"insights-runtime-extractor-t44zr\" (UID: \"58c1791e-3029-4f4d-be45-e94ca7e72a6e\") " pod="openshift-insights/insights-runtime-extractor-t44zr" Apr 17 16:32:10.346125 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.346107 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e5a8842-0933-4179-9241-eb13bf048769-cert\") pod \"ingress-canary-96878\" (UID: \"2e5a8842-0933-4179-9241-eb13bf048769\") " pod="openshift-ingress-canary/ingress-canary-96878" Apr 17 16:32:10.346483 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.346139 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/330bfc85-4775-4a2f-91d9-0120c5940a66-ca-trust-extracted\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt" Apr 17 16:32:10.346483 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.346169 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/330bfc85-4775-4a2f-91d9-0120c5940a66-image-registry-private-configuration\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " 
pod="openshift-image-registry/image-registry-6986bdc748-x8wgt" Apr 17 16:32:10.346483 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.346184 2548 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:10.346483 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.346195 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-bound-sa-token\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt" Apr 17 16:32:10.346483 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.346226 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e5a8842-0933-4179-9241-eb13bf048769-cert podName:2e5a8842-0933-4179-9241-eb13bf048769 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:10.846212307 +0000 UTC m=+34.084728064 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e5a8842-0933-4179-9241-eb13bf048769-cert") pod "ingress-canary-96878" (UID: "2e5a8842-0933-4179-9241-eb13bf048769") : secret "canary-serving-cert" not found Apr 17 16:32:10.346483 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.346273 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/58c1791e-3029-4f4d-be45-e94ca7e72a6e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-t44zr\" (UID: \"58c1791e-3029-4f4d-be45-e94ca7e72a6e\") " pod="openshift-insights/insights-runtime-extractor-t44zr" Apr 17 16:32:10.346483 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.346299 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/330bfc85-4775-4a2f-91d9-0120c5940a66-trusted-ca\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt" Apr 17 16:32:10.346483 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.346431 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/330bfc85-4775-4a2f-91d9-0120c5940a66-ca-trust-extracted\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt" Apr 17 16:32:10.349671 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.349554 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/330bfc85-4775-4a2f-91d9-0120c5940a66-image-registry-private-configuration\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt" Apr 17 16:32:10.349747 
ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.349655 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/330bfc85-4775-4a2f-91d9-0120c5940a66-installation-pull-secrets\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt" Apr 17 16:32:10.354647 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.354629 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzcxt\" (UniqueName: \"kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-kube-api-access-bzcxt\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt" Apr 17 16:32:10.354769 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.354679 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9crd\" (UniqueName: \"kubernetes.io/projected/2e5a8842-0933-4179-9241-eb13bf048769-kube-api-access-z9crd\") pod \"ingress-canary-96878\" (UID: \"2e5a8842-0933-4179-9241-eb13bf048769\") " pod="openshift-ingress-canary/ingress-canary-96878" Apr 17 16:32:10.354769 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.354688 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrnnt\" (UniqueName: \"kubernetes.io/projected/58c1791e-3029-4f4d-be45-e94ca7e72a6e-kube-api-access-vrnnt\") pod \"insights-runtime-extractor-t44zr\" (UID: \"58c1791e-3029-4f4d-be45-e94ca7e72a6e\") " pod="openshift-insights/insights-runtime-extractor-t44zr" Apr 17 16:32:10.355403 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.355380 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-bound-sa-token\") pod 
\"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt" Apr 17 16:32:10.447146 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.447088 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpd9q\" (UniqueName: \"kubernetes.io/projected/d91bd4ff-4efd-449d-b375-d8843508d28c-kube-api-access-qpd9q\") pod \"dns-default-6w5xn\" (UID: \"d91bd4ff-4efd-449d-b375-d8843508d28c\") " pod="openshift-dns/dns-default-6w5xn" Apr 17 16:32:10.447146 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.447122 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d91bd4ff-4efd-449d-b375-d8843508d28c-config-volume\") pod \"dns-default-6w5xn\" (UID: \"d91bd4ff-4efd-449d-b375-d8843508d28c\") " pod="openshift-dns/dns-default-6w5xn" Apr 17 16:32:10.447146 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.447137 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d91bd4ff-4efd-449d-b375-d8843508d28c-tmp-dir\") pod \"dns-default-6w5xn\" (UID: \"d91bd4ff-4efd-449d-b375-d8843508d28c\") " pod="openshift-dns/dns-default-6w5xn" Apr 17 16:32:10.447346 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.447180 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d91bd4ff-4efd-449d-b375-d8843508d28c-metrics-tls\") pod \"dns-default-6w5xn\" (UID: \"d91bd4ff-4efd-449d-b375-d8843508d28c\") " pod="openshift-dns/dns-default-6w5xn" Apr 17 16:32:10.450378 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.447650 2548 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:10.450378 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.447732 2548 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d91bd4ff-4efd-449d-b375-d8843508d28c-metrics-tls podName:d91bd4ff-4efd-449d-b375-d8843508d28c nodeName:}" failed. No retries permitted until 2026-04-17 16:32:10.947714858 +0000 UTC m=+34.186230632 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d91bd4ff-4efd-449d-b375-d8843508d28c-metrics-tls") pod "dns-default-6w5xn" (UID: "d91bd4ff-4efd-449d-b375-d8843508d28c") : secret "dns-default-metrics-tls" not found Apr 17 16:32:10.450378 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.447811 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d91bd4ff-4efd-449d-b375-d8843508d28c-tmp-dir\") pod \"dns-default-6w5xn\" (UID: \"d91bd4ff-4efd-449d-b375-d8843508d28c\") " pod="openshift-dns/dns-default-6w5xn" Apr 17 16:32:10.450378 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.447949 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d91bd4ff-4efd-449d-b375-d8843508d28c-config-volume\") pod \"dns-default-6w5xn\" (UID: \"d91bd4ff-4efd-449d-b375-d8843508d28c\") " pod="openshift-dns/dns-default-6w5xn" Apr 17 16:32:10.455201 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.455180 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpd9q\" (UniqueName: \"kubernetes.io/projected/d91bd4ff-4efd-449d-b375-d8843508d28c-kube-api-access-qpd9q\") pod \"dns-default-6w5xn\" (UID: \"d91bd4ff-4efd-449d-b375-d8843508d28c\") " pod="openshift-dns/dns-default-6w5xn" Apr 17 16:32:10.851058 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.851026 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-registry-tls\") pod 
\"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt" Apr 17 16:32:10.851306 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.851077 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/58c1791e-3029-4f4d-be45-e94ca7e72a6e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-t44zr\" (UID: \"58c1791e-3029-4f4d-be45-e94ca7e72a6e\") " pod="openshift-insights/insights-runtime-extractor-t44zr" Apr 17 16:32:10.851306 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.851100 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e5a8842-0933-4179-9241-eb13bf048769-cert\") pod \"ingress-canary-96878\" (UID: \"2e5a8842-0933-4179-9241-eb13bf048769\") " pod="openshift-ingress-canary/ingress-canary-96878" Apr 17 16:32:10.851306 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.851187 2548 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:10.851306 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.851191 2548 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:32:10.851306 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.851217 2548 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6986bdc748-x8wgt: secret "image-registry-tls" not found Apr 17 16:32:10.851306 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.851229 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e5a8842-0933-4179-9241-eb13bf048769-cert podName:2e5a8842-0933-4179-9241-eb13bf048769 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:32:11.85121682 +0000 UTC m=+35.089732577 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e5a8842-0933-4179-9241-eb13bf048769-cert") pod "ingress-canary-96878" (UID: "2e5a8842-0933-4179-9241-eb13bf048769") : secret "canary-serving-cert" not found Apr 17 16:32:10.851306 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.851282 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-registry-tls podName:330bfc85-4775-4a2f-91d9-0120c5940a66 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:11.851264901 +0000 UTC m=+35.089780676 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-registry-tls") pod "image-registry-6986bdc748-x8wgt" (UID: "330bfc85-4775-4a2f-91d9-0120c5940a66") : secret "image-registry-tls" not found Apr 17 16:32:10.851306 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.851193 2548 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 17 16:32:10.851581 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.851347 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58c1791e-3029-4f4d-be45-e94ca7e72a6e-insights-runtime-extractor-tls podName:58c1791e-3029-4f4d-be45-e94ca7e72a6e nodeName:}" failed. No retries permitted until 2026-04-17 16:32:11.851335246 +0000 UTC m=+35.089851002 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/58c1791e-3029-4f4d-be45-e94ca7e72a6e-insights-runtime-extractor-tls") pod "insights-runtime-extractor-t44zr" (UID: "58c1791e-3029-4f4d-be45-e94ca7e72a6e") : secret "insights-runtime-extractor-tls" not found Apr 17 16:32:10.951597 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:10.951567 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d91bd4ff-4efd-449d-b375-d8843508d28c-metrics-tls\") pod \"dns-default-6w5xn\" (UID: \"d91bd4ff-4efd-449d-b375-d8843508d28c\") " pod="openshift-dns/dns-default-6w5xn" Apr 17 16:32:10.951943 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.951690 2548 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:10.951943 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:10.951736 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d91bd4ff-4efd-449d-b375-d8843508d28c-metrics-tls podName:d91bd4ff-4efd-449d-b375-d8843508d28c nodeName:}" failed. No retries permitted until 2026-04-17 16:32:11.951725081 +0000 UTC m=+35.190240837 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d91bd4ff-4efd-449d-b375-d8843508d28c-metrics-tls") pod "dns-default-6w5xn" (UID: "d91bd4ff-4efd-449d-b375-d8843508d28c") : secret "dns-default-metrics-tls" not found Apr 17 16:32:11.858815 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:11.858779 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-registry-tls\") pod \"image-registry-6986bdc748-x8wgt\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") " pod="openshift-image-registry/image-registry-6986bdc748-x8wgt" Apr 17 16:32:11.859019 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:11.858835 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/58c1791e-3029-4f4d-be45-e94ca7e72a6e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-t44zr\" (UID: \"58c1791e-3029-4f4d-be45-e94ca7e72a6e\") " pod="openshift-insights/insights-runtime-extractor-t44zr" Apr 17 16:32:11.859019 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:11.858864 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e5a8842-0933-4179-9241-eb13bf048769-cert\") pod \"ingress-canary-96878\" (UID: \"2e5a8842-0933-4179-9241-eb13bf048769\") " pod="openshift-ingress-canary/ingress-canary-96878" Apr 17 16:32:11.859019 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:11.858945 2548 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:32:11.859019 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:11.858962 2548 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6986bdc748-x8wgt: secret "image-registry-tls" not found Apr 17 
16:32:11.859019 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:11.858975 2548 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:11.859019 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:11.858991 2548 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 17 16:32:11.859019 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:11.859015 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-registry-tls podName:330bfc85-4775-4a2f-91d9-0120c5940a66 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:13.859000974 +0000 UTC m=+37.097516730 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-registry-tls") pod "image-registry-6986bdc748-x8wgt" (UID: "330bfc85-4775-4a2f-91d9-0120c5940a66") : secret "image-registry-tls" not found Apr 17 16:32:11.859268 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:11.859034 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58c1791e-3029-4f4d-be45-e94ca7e72a6e-insights-runtime-extractor-tls podName:58c1791e-3029-4f4d-be45-e94ca7e72a6e nodeName:}" failed. No retries permitted until 2026-04-17 16:32:13.859021298 +0000 UTC m=+37.097537055 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/58c1791e-3029-4f4d-be45-e94ca7e72a6e-insights-runtime-extractor-tls") pod "insights-runtime-extractor-t44zr" (UID: "58c1791e-3029-4f4d-be45-e94ca7e72a6e") : secret "insights-runtime-extractor-tls" not found Apr 17 16:32:11.859268 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:11.859047 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e5a8842-0933-4179-9241-eb13bf048769-cert podName:2e5a8842-0933-4179-9241-eb13bf048769 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:13.859041203 +0000 UTC m=+37.097556960 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e5a8842-0933-4179-9241-eb13bf048769-cert") pod "ingress-canary-96878" (UID: "2e5a8842-0933-4179-9241-eb13bf048769") : secret "canary-serving-cert" not found Apr 17 16:32:11.959344 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:11.959319 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d91bd4ff-4efd-449d-b375-d8843508d28c-metrics-tls\") pod \"dns-default-6w5xn\" (UID: \"d91bd4ff-4efd-449d-b375-d8843508d28c\") " pod="openshift-dns/dns-default-6w5xn" Apr 17 16:32:11.973080 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:11.973059 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d91bd4ff-4efd-449d-b375-d8843508d28c-metrics-tls\") pod \"dns-default-6w5xn\" (UID: \"d91bd4ff-4efd-449d-b375-d8843508d28c\") " pod="openshift-dns/dns-default-6w5xn" Apr 17 16:32:11.997211 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:11.997184 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6w5xn" Apr 17 16:32:12.115475 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.115416 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6986bdc748-x8wgt"] Apr 17 16:32:12.115713 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:12.115683 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-6986bdc748-x8wgt" podUID="330bfc85-4775-4a2f-91d9-0120c5940a66" Apr 17 16:32:12.155958 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.155933 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-69d75d96b8-v2j79"] Apr 17 16:32:12.178751 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:32:12.178725 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd91bd4ff_4efd_449d_b375_d8843508d28c.slice/crio-08780d244cada6bbd2017b44ac7fa7ecab874cd02f71e40f74edbe1b184783c9 WatchSource:0}: Error finding container 08780d244cada6bbd2017b44ac7fa7ecab874cd02f71e40f74edbe1b184783c9: Status 404 returned error can't find the container with id 08780d244cada6bbd2017b44ac7fa7ecab874cd02f71e40f74edbe1b184783c9 Apr 17 16:32:12.179386 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.179372 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69d75d96b8-v2j79"] Apr 17 16:32:12.179452 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.179396 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6w5xn"] Apr 17 16:32:12.179524 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.179511 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-69d75d96b8-v2j79" Apr 17 16:32:12.261238 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.261214 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-ca-trust-extracted\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79" Apr 17 16:32:12.261368 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.261247 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-installation-pull-secrets\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79" Apr 17 16:32:12.261368 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.261269 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-image-registry-private-configuration\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79" Apr 17 16:32:12.261451 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.261365 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-registry-tls\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79" Apr 17 16:32:12.261451 ip-10-0-141-239 
kubenswrapper[2548]: I0417 16:32:12.261439 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-trusted-ca\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:12.261512 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.261459 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgzk9\" (UniqueName: \"kubernetes.io/projected/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-kube-api-access-fgzk9\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:12.261512 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.261479 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-registry-certificates\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:12.261512 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.261501 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-bound-sa-token\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:12.362301 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.362267 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-ca-trust-extracted\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:12.362301 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.362304 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-installation-pull-secrets\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:12.362449 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.362328 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-image-registry-private-configuration\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:12.362495 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.362476 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-registry-tls\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:12.362567 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.362549 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-trusted-ca\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:12.362618 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.362580 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgzk9\" (UniqueName: \"kubernetes.io/projected/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-kube-api-access-fgzk9\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:12.362618 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.362606 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-registry-certificates\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:12.362703 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.362624 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-ca-trust-extracted\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:12.362703 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.362633 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-bound-sa-token\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:12.363154 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.363130 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-registry-certificates\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:12.365607 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.365556 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-registry-tls\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:12.365710 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.365672 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-installation-pull-secrets\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:12.365788 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.365770 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-image-registry-private-configuration\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:12.371427 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.371410 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgzk9\" (UniqueName: \"kubernetes.io/projected/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-kube-api-access-fgzk9\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:12.372257 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.372235 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-bound-sa-token\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:12.374686 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.374667 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-trusted-ca\") pod \"image-registry-69d75d96b8-v2j79\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:12.471191 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.471162 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6w5xn" event={"ID":"d91bd4ff-4efd-449d-b375-d8843508d28c","Type":"ContainerStarted","Data":"08780d244cada6bbd2017b44ac7fa7ecab874cd02f71e40f74edbe1b184783c9"}
Apr 17 16:32:12.471191 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.471200 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6986bdc748-x8wgt"
Apr 17 16:32:12.475093 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.475076 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6986bdc748-x8wgt"
Apr 17 16:32:12.489951 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.489929 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:12.565256 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.565230 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/330bfc85-4775-4a2f-91d9-0120c5940a66-registry-certificates\") pod \"330bfc85-4775-4a2f-91d9-0120c5940a66\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") "
Apr 17 16:32:12.565365 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.565262 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-bound-sa-token\") pod \"330bfc85-4775-4a2f-91d9-0120c5940a66\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") "
Apr 17 16:32:12.565365 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.565285 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/330bfc85-4775-4a2f-91d9-0120c5940a66-ca-trust-extracted\") pod \"330bfc85-4775-4a2f-91d9-0120c5940a66\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") "
Apr 17 16:32:12.565365 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.565320 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/330bfc85-4775-4a2f-91d9-0120c5940a66-image-registry-private-configuration\") pod \"330bfc85-4775-4a2f-91d9-0120c5940a66\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") "
Apr 17 16:32:12.565595 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.565369 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzcxt\" (UniqueName: \"kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-kube-api-access-bzcxt\") pod \"330bfc85-4775-4a2f-91d9-0120c5940a66\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") "
Apr 17 16:32:12.565595 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.565410 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/330bfc85-4775-4a2f-91d9-0120c5940a66-trusted-ca\") pod \"330bfc85-4775-4a2f-91d9-0120c5940a66\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") "
Apr 17 16:32:12.565595 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.565436 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/330bfc85-4775-4a2f-91d9-0120c5940a66-installation-pull-secrets\") pod \"330bfc85-4775-4a2f-91d9-0120c5940a66\" (UID: \"330bfc85-4775-4a2f-91d9-0120c5940a66\") "
Apr 17 16:32:12.565746 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.565604 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/330bfc85-4775-4a2f-91d9-0120c5940a66-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "330bfc85-4775-4a2f-91d9-0120c5940a66" (UID: "330bfc85-4775-4a2f-91d9-0120c5940a66"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:32:12.565746 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.565692 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/330bfc85-4775-4a2f-91d9-0120c5940a66-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "330bfc85-4775-4a2f-91d9-0120c5940a66" (UID: "330bfc85-4775-4a2f-91d9-0120c5940a66"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:32:12.565845 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.565741 2548 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/330bfc85-4775-4a2f-91d9-0120c5940a66-registry-certificates\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:32:12.565914 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.565835 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/330bfc85-4775-4a2f-91d9-0120c5940a66-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "330bfc85-4775-4a2f-91d9-0120c5940a66" (UID: "330bfc85-4775-4a2f-91d9-0120c5940a66"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:32:12.567796 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.567765 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/330bfc85-4775-4a2f-91d9-0120c5940a66-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "330bfc85-4775-4a2f-91d9-0120c5940a66" (UID: "330bfc85-4775-4a2f-91d9-0120c5940a66"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:32:12.567884 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.567824 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/330bfc85-4775-4a2f-91d9-0120c5940a66-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "330bfc85-4775-4a2f-91d9-0120c5940a66" (UID: "330bfc85-4775-4a2f-91d9-0120c5940a66"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:32:12.568150 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.568130 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-kube-api-access-bzcxt" (OuterVolumeSpecName: "kube-api-access-bzcxt") pod "330bfc85-4775-4a2f-91d9-0120c5940a66" (UID: "330bfc85-4775-4a2f-91d9-0120c5940a66"). InnerVolumeSpecName "kube-api-access-bzcxt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:32:12.568150 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.568140 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "330bfc85-4775-4a2f-91d9-0120c5940a66" (UID: "330bfc85-4775-4a2f-91d9-0120c5940a66"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:32:12.604204 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.604178 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69d75d96b8-v2j79"]
Apr 17 16:32:12.607615 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:32:12.607590 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f757ddb_4ea6_4bda_8b90_920ac807f2eb.slice/crio-20d159d1996a0e1614ccfa8b0bf3e0e6404e32e0ddadf06adecbaca0203dc4d7 WatchSource:0}: Error finding container 20d159d1996a0e1614ccfa8b0bf3e0e6404e32e0ddadf06adecbaca0203dc4d7: Status 404 returned error can't find the container with id 20d159d1996a0e1614ccfa8b0bf3e0e6404e32e0ddadf06adecbaca0203dc4d7
Apr 17 16:32:12.666467 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.666441 2548 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-bound-sa-token\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:32:12.666538 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.666475 2548 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/330bfc85-4775-4a2f-91d9-0120c5940a66-ca-trust-extracted\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:32:12.666538 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.666491 2548 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/330bfc85-4775-4a2f-91d9-0120c5940a66-image-registry-private-configuration\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:32:12.666538 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.666507 2548 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bzcxt\" (UniqueName: \"kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-kube-api-access-bzcxt\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:32:12.666538 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.666520 2548 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/330bfc85-4775-4a2f-91d9-0120c5940a66-trusted-ca\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:32:12.666538 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:12.666531 2548 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/330bfc85-4775-4a2f-91d9-0120c5940a66-installation-pull-secrets\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:32:13.475144 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:13.474944 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69d75d96b8-v2j79" event={"ID":"7f757ddb-4ea6-4bda-8b90-920ac807f2eb","Type":"ContainerStarted","Data":"7f18cf81ce9a8a508cc00e44dda7ba7bf21cd1d64bae6764209e8d795aca6675"}
Apr 17 16:32:13.475707 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:13.475163 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-69d75d96b8-v2j79"
Apr 17 16:32:13.475707 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:13.475178 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69d75d96b8-v2j79" event={"ID":"7f757ddb-4ea6-4bda-8b90-920ac807f2eb","Type":"ContainerStarted","Data":"20d159d1996a0e1614ccfa8b0bf3e0e6404e32e0ddadf06adecbaca0203dc4d7"}
Apr 17 16:32:13.482786 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:13.482762 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6986bdc748-x8wgt"
Apr 17 16:32:13.506269 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:13.506224 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-69d75d96b8-v2j79" podStartSLOduration=1.506209874 podStartE2EDuration="1.506209874s" podCreationTimestamp="2026-04-17 16:32:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:32:13.505788699 +0000 UTC m=+36.744304489" watchObservedRunningTime="2026-04-17 16:32:13.506209874 +0000 UTC m=+36.744725657"
Apr 17 16:32:13.540448 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:13.540421 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6986bdc748-x8wgt"]
Apr 17 16:32:13.545647 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:13.545620 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6986bdc748-x8wgt"]
Apr 17 16:32:13.674997 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:13.674933 2548 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/330bfc85-4775-4a2f-91d9-0120c5940a66-registry-tls\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:32:13.877153 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:13.877126 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/58c1791e-3029-4f4d-be45-e94ca7e72a6e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-t44zr\" (UID: \"58c1791e-3029-4f4d-be45-e94ca7e72a6e\") " pod="openshift-insights/insights-runtime-extractor-t44zr"
Apr 17 16:32:13.877305 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:13.877156 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e5a8842-0933-4179-9241-eb13bf048769-cert\") pod \"ingress-canary-96878\" (UID: \"2e5a8842-0933-4179-9241-eb13bf048769\") " pod="openshift-ingress-canary/ingress-canary-96878"
Apr 17 16:32:13.880130 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:13.880109 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/58c1791e-3029-4f4d-be45-e94ca7e72a6e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-t44zr\" (UID: \"58c1791e-3029-4f4d-be45-e94ca7e72a6e\") " pod="openshift-insights/insights-runtime-extractor-t44zr"
Apr 17 16:32:13.880241 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:13.880179 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e5a8842-0933-4179-9241-eb13bf048769-cert\") pod \"ingress-canary-96878\" (UID: \"2e5a8842-0933-4179-9241-eb13bf048769\") " pod="openshift-ingress-canary/ingress-canary-96878"
Apr 17 16:32:14.056553 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:14.056527 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-96878"
Apr 17 16:32:14.078589 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:14.078565 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-t44zr"
Apr 17 16:32:14.204983 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:14.204705 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-96878"]
Apr 17 16:32:14.217045 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:32:14.217013 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e5a8842_0933_4179_9241_eb13bf048769.slice/crio-a7b597966ef1424859c43cd5aeabba59f93a5511c3f2e50382f704a02be6b866 WatchSource:0}: Error finding container a7b597966ef1424859c43cd5aeabba59f93a5511c3f2e50382f704a02be6b866: Status 404 returned error can't find the container with id a7b597966ef1424859c43cd5aeabba59f93a5511c3f2e50382f704a02be6b866
Apr 17 16:32:14.229549 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:14.229530 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-t44zr"]
Apr 17 16:32:14.237358 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:32:14.237337 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58c1791e_3029_4f4d_be45_e94ca7e72a6e.slice/crio-3f8770f36b931dd51341d9464dda64b7db1c6aad3292dc61668149973686a21b WatchSource:0}: Error finding container 3f8770f36b931dd51341d9464dda64b7db1c6aad3292dc61668149973686a21b: Status 404 returned error can't find the container with id 3f8770f36b931dd51341d9464dda64b7db1c6aad3292dc61668149973686a21b
Apr 17 16:32:14.478249 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:14.478210 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6w5xn" event={"ID":"d91bd4ff-4efd-449d-b375-d8843508d28c","Type":"ContainerStarted","Data":"5bb934869ed02b30a87240c9ac7f75036f939af43a6f2ffcb5f87f964170e16a"}
Apr 17 16:32:14.478783 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:14.478254 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6w5xn" event={"ID":"d91bd4ff-4efd-449d-b375-d8843508d28c","Type":"ContainerStarted","Data":"d5482fd128291d1a17da74c02d058c7e3be39d78fb24038104d0fb8beb300f11"}
Apr 17 16:32:14.478783 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:14.478453 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6w5xn"
Apr 17 16:32:14.479578 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:14.479558 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-t44zr" event={"ID":"58c1791e-3029-4f4d-be45-e94ca7e72a6e","Type":"ContainerStarted","Data":"0d3627f13dd53343bf9f47ae3a9952128d3703cfae7a861fb82c60417df4b950"}
Apr 17 16:32:14.479649 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:14.479586 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-t44zr" event={"ID":"58c1791e-3029-4f4d-be45-e94ca7e72a6e","Type":"ContainerStarted","Data":"3f8770f36b931dd51341d9464dda64b7db1c6aad3292dc61668149973686a21b"}
Apr 17 16:32:14.480555 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:14.480536 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-96878" event={"ID":"2e5a8842-0933-4179-9241-eb13bf048769","Type":"ContainerStarted","Data":"a7b597966ef1424859c43cd5aeabba59f93a5511c3f2e50382f704a02be6b866"}
Apr 17 16:32:14.497381 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:14.497344 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6w5xn" podStartSLOduration=2.737581421 podStartE2EDuration="4.497332564s" podCreationTimestamp="2026-04-17 16:32:10 +0000 UTC" firstStartedPulling="2026-04-17 16:32:12.180714359 +0000 UTC m=+35.419230128" lastFinishedPulling="2026-04-17 16:32:13.940465512 +0000 UTC m=+37.178981271" observedRunningTime="2026-04-17 16:32:14.496936478 +0000 UTC m=+37.735452254" watchObservedRunningTime="2026-04-17 16:32:14.497332564 +0000 UTC m=+37.735848321"
Apr 17 16:32:14.670565 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:14.670517 2548 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:18d788b8deb049d24dfdb101371f6d2211a5e731bacd64b08adb97f66b6c4eb1: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:18d788b8deb049d24dfdb101371f6d2211a5e731bacd64b08adb97f66b6c4eb1"
Apr 17 16:32:14.670721 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:14.670680 2548 kuberuntime_manager.go:1358] "Unhandled Error" err="init container &Container{Name:routeoverride-cni,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:18d788b8deb049d24dfdb101371f6d2211a5e731bacd64b08adb97f66b6c4eb1,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/route-override/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/route-override/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/route-override/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ww2wl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-2qscf_openshift-multus(c756a090-293e-4944-9021-f8de796a8b45): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:18d788b8deb049d24dfdb101371f6d2211a5e731bacd64b08adb97f66b6c4eb1: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" logger="UnhandledError"
Apr 17 16:32:14.671855 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:14.671825 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"routeoverride-cni\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:18d788b8deb049d24dfdb101371f6d2211a5e731bacd64b08adb97f66b6c4eb1: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-multus/multus-additional-cni-plugins-2qscf" podUID="c756a090-293e-4944-9021-f8de796a8b45"
Apr 17 16:32:15.088349 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:15.088308 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8405d132-1e05-4ddb-89bd-dcec490db483-original-pull-secret\") pod \"global-pull-secret-syncer-cp2k7\" (UID: \"8405d132-1e05-4ddb-89bd-dcec490db483\") " pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:32:15.092577 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:15.092526 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8405d132-1e05-4ddb-89bd-dcec490db483-original-pull-secret\") pod \"global-pull-secret-syncer-cp2k7\" (UID: \"8405d132-1e05-4ddb-89bd-dcec490db483\") " pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:32:15.141125 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:15.141094 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cp2k7"
Apr 17 16:32:15.324956 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:15.324927 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="330bfc85-4775-4a2f-91d9-0120c5940a66" path="/var/lib/kubelet/pods/330bfc85-4775-4a2f-91d9-0120c5940a66/volumes"
Apr 17 16:32:15.334241 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:15.334214 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cp2k7"]
Apr 17 16:32:15.338325 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:32:15.338301 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8405d132_1e05_4ddb_89bd_dcec490db483.slice/crio-6b25c6ac68b09f2047da467b49344a43246e2c8a7bf86a0a6aa23ac0d4936b0f WatchSource:0}: Error finding container 6b25c6ac68b09f2047da467b49344a43246e2c8a7bf86a0a6aa23ac0d4936b0f: Status 404 returned error can't find the container with id 6b25c6ac68b09f2047da467b49344a43246e2c8a7bf86a0a6aa23ac0d4936b0f
Apr 17 16:32:15.483589 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:15.483544 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cp2k7" event={"ID":"8405d132-1e05-4ddb-89bd-dcec490db483","Type":"ContainerStarted","Data":"6b25c6ac68b09f2047da467b49344a43246e2c8a7bf86a0a6aa23ac0d4936b0f"}
Apr 17 16:32:15.563038 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:15.563007 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"routeoverride-cni\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:18d788b8deb049d24dfdb101371f6d2211a5e731bacd64b08adb97f66b6c4eb1\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:18d788b8deb049d24dfdb101371f6d2211a5e731bacd64b08adb97f66b6c4eb1: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-multus/multus-additional-cni-plugins-2qscf" podUID="c756a090-293e-4944-9021-f8de796a8b45"
Apr 17 16:32:16.487864 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:16.487606 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-t44zr" event={"ID":"58c1791e-3029-4f4d-be45-e94ca7e72a6e","Type":"ContainerStarted","Data":"567991bbd1a5dba00f069e9767b033272641b62bc974a2617b477a62b11de033"}
Apr 17 16:32:17.491878 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:17.491834 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-96878" event={"ID":"2e5a8842-0933-4179-9241-eb13bf048769","Type":"ContainerStarted","Data":"64702655684c55983ef559575a786c8441121394d3e00661a2362276fcdf8b27"}
Apr 17 16:32:17.510204 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:17.510154 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-96878" podStartSLOduration=5.2390594329999995 podStartE2EDuration="7.510138913s" podCreationTimestamp="2026-04-17 16:32:10 +0000 UTC" firstStartedPulling="2026-04-17 16:32:14.2187852 +0000 UTC m=+37.457300960" lastFinishedPulling="2026-04-17 16:32:16.489864677 +0000 UTC m=+39.728380440" observedRunningTime="2026-04-17 16:32:17.509354762 +0000 UTC m=+40.747870542" watchObservedRunningTime="2026-04-17 16:32:17.510138913 +0000 UTC m=+40.748654691"
Apr 17 16:32:20.500304 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:20.500259 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-t44zr" event={"ID":"58c1791e-3029-4f4d-be45-e94ca7e72a6e","Type":"ContainerStarted","Data":"779ac36fd3f9c195ce78d660454c57b107bee734e9c9e08f7ac10eda8ebb7f1d"}
Apr 17 16:32:20.501403 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:20.501381 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cp2k7" event={"ID":"8405d132-1e05-4ddb-89bd-dcec490db483","Type":"ContainerStarted","Data":"b9db66a4ef344215da88c71f828266cb1c22bc18dd22d53d6a898823bfca3d1e"}
Apr 17 16:32:20.516348 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:20.516305 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-t44zr" podStartSLOduration=5.200602806 podStartE2EDuration="10.5162935s" podCreationTimestamp="2026-04-17 16:32:10 +0000 UTC" firstStartedPulling="2026-04-17 16:32:14.324855629 +0000 UTC m=+37.563371395" lastFinishedPulling="2026-04-17 16:32:19.64054633 +0000 UTC m=+42.879062089" observedRunningTime="2026-04-17 16:32:20.515281029 +0000 UTC m=+43.753796808" watchObservedRunningTime="2026-04-17 16:32:20.5162935 +0000 UTC m=+43.754809270"
Apr 17 16:32:20.528478 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:20.528421 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-cp2k7" podStartSLOduration=32.913338165 podStartE2EDuration="37.528407548s" podCreationTimestamp="2026-04-17 16:31:43 +0000 UTC" firstStartedPulling="2026-04-17 16:32:15.339866396 +0000 UTC m=+38.578382156" lastFinishedPulling="2026-04-17 16:32:19.954935776 +0000 UTC m=+43.193451539" observedRunningTime="2026-04-17 16:32:20.527964108 +0000 UTC m=+43.766479887" watchObservedRunningTime="2026-04-17 16:32:20.528407548 +0000 UTC m=+43.766923329"
Apr 17 16:32:24.486163 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:24.486134 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6w5xn"
Apr 17 16:32:27.516797 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:27.516716 2548 generic.go:358] "Generic (PLEG): container finished" podID="c756a090-293e-4944-9021-f8de796a8b45" containerID="205877ebb15fe9384fe3456196f6d022dba8d28275c660e2c75be68726c4e269" exitCode=0
Apr 17 16:32:27.516797 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:27.516764 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2qscf" event={"ID":"c756a090-293e-4944-9021-f8de796a8b45","Type":"ContainerDied","Data":"205877ebb15fe9384fe3456196f6d022dba8d28275c660e2c75be68726c4e269"}
Apr 17 16:32:32.496260 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:32.496220 2548 patch_prober.go:28] interesting pod/image-registry-69d75d96b8-v2j79 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 16:32:32.496693 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:32.496304 2548 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-69d75d96b8-v2j79" podUID="7f757ddb-4ea6-4bda-8b90-920ac807f2eb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 16:32:33.531940 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:33.531880 2548 generic.go:358] "Generic (PLEG): container finished" podID="c756a090-293e-4944-9021-f8de796a8b45" containerID="2b950be5d77cedf8c4e2c2816d5eb0871763e69d55685654cbbe4c81596398d8" exitCode=0
Apr 17 16:32:33.531940 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:33.531930 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2qscf" event={"ID":"c756a090-293e-4944-9021-f8de796a8b45","Type":"ContainerDied","Data":"2b950be5d77cedf8c4e2c2816d5eb0871763e69d55685654cbbe4c81596398d8"}
Apr 17 16:32:34.476524 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:34.476493 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lq8np"
Apr 17 16:32:34.485051 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:34.485021 2548 patch_prober.go:28] interesting pod/image-registry-69d75d96b8-v2j79 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 16:32:34.485177 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:34.485064 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-69d75d96b8-v2j79" podUID="7f757ddb-4ea6-4bda-8b90-920ac807f2eb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 16:32:34.538416 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:34.538384 2548 generic.go:358] "Generic (PLEG): container finished" podID="c756a090-293e-4944-9021-f8de796a8b45" containerID="f7c422878d4a5d8dd518ffe0d977c819122702fabce7b8737c451e7f05f1ca59" exitCode=0
Apr 17 16:32:34.538785 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:34.538439 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2qscf" event={"ID":"c756a090-293e-4944-9021-f8de796a8b45","Type":"ContainerDied","Data":"f7c422878d4a5d8dd518ffe0d977c819122702fabce7b8737c451e7f05f1ca59"}
Apr 17 16:32:35.543512 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:35.543468 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2qscf" event={"ID":"c756a090-293e-4944-9021-f8de796a8b45","Type":"ContainerStarted","Data":"d2764cb19f44559cc0ccec6db364ddfc1a3e22f7aaa013920da8f01e784d19f6"}
Apr 17 16:32:35.568186 ip-10-0-141-239 kubenswrapper[2548]:
I0417 16:32:35.568139 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2qscf" podStartSLOduration=4.247488289 podStartE2EDuration="58.568127532s" podCreationTimestamp="2026-04-17 16:31:37 +0000 UTC" firstStartedPulling="2026-04-17 16:31:38.539883658 +0000 UTC m=+1.778399417" lastFinishedPulling="2026-04-17 16:32:32.860522899 +0000 UTC m=+56.099038660" observedRunningTime="2026-04-17 16:32:35.56777379 +0000 UTC m=+58.806289604" watchObservedRunningTime="2026-04-17 16:32:35.568127532 +0000 UTC m=+58.806643310" Apr 17 16:32:36.545889 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.545860 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-flqq6"] Apr 17 16:32:36.548918 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.548877 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-759c5d468-wd5lm"] Apr 17 16:32:36.549049 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.549012 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-flqq6" Apr 17 16:32:36.551324 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.551306 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 16:32:36.551324 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.551317 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 16:32:36.551493 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.551473 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-25lv8\"" Apr 17 16:32:36.551561 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.551543 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.553964 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.553947 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 16:32:36.554855 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.554828 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 16:32:36.554977 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.554881 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 16:32:36.555156 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.555140 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 16:32:36.555407 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.555394 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 16:32:36.555997 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.555962 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-4nh7g\"" Apr 17 16:32:36.559397 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.559377 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 16:32:36.562198 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.562177 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-flqq6"] Apr 17 16:32:36.579603 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.579584 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-759c5d468-wd5lm"] Apr 17 16:32:36.641421 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.641395 2548 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgbt4"] Apr 17 16:32:36.644234 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.644220 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgbt4" Apr 17 16:32:36.646935 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.646919 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-qmbjq\"" Apr 17 16:32:36.646996 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.646955 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 17 16:32:36.651436 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.651417 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-console-config\") pod \"console-759c5d468-wd5lm\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.651522 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.651448 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-console-oauth-config\") pod \"console-759c5d468-wd5lm\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.651522 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.651501 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w7t8\" (UniqueName: 
\"kubernetes.io/projected/21c90255-d7fb-4403-bf8c-e896082f1d3c-kube-api-access-2w7t8\") pod \"downloads-6bcc868b7-flqq6\" (UID: \"21c90255-d7fb-4403-bf8c-e896082f1d3c\") " pod="openshift-console/downloads-6bcc868b7-flqq6" Apr 17 16:32:36.651628 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.651531 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-service-ca\") pod \"console-759c5d468-wd5lm\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.651628 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.651559 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-console-serving-cert\") pod \"console-759c5d468-wd5lm\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.651628 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.651622 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-trusted-ca-bundle\") pod \"console-759c5d468-wd5lm\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.651746 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.651645 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-oauth-serving-cert\") pod \"console-759c5d468-wd5lm\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.651746 ip-10-0-141-239 kubenswrapper[2548]: 
I0417 16:32:36.651667 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bh8p\" (UniqueName: \"kubernetes.io/projected/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-kube-api-access-5bh8p\") pod \"console-759c5d468-wd5lm\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.654522 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.654506 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgbt4"] Apr 17 16:32:36.752508 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.752488 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f56f8820-e530-469d-92c4-2ff27422a302-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-cgbt4\" (UID: \"f56f8820-e530-469d-92c4-2ff27422a302\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgbt4" Apr 17 16:32:36.752614 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.752541 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2w7t8\" (UniqueName: \"kubernetes.io/projected/21c90255-d7fb-4403-bf8c-e896082f1d3c-kube-api-access-2w7t8\") pod \"downloads-6bcc868b7-flqq6\" (UID: \"21c90255-d7fb-4403-bf8c-e896082f1d3c\") " pod="openshift-console/downloads-6bcc868b7-flqq6" Apr 17 16:32:36.752614 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.752567 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-service-ca\") pod \"console-759c5d468-wd5lm\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.752614 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.752609 2548 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-console-serving-cert\") pod \"console-759c5d468-wd5lm\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.752744 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.752630 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-trusted-ca-bundle\") pod \"console-759c5d468-wd5lm\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.752744 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.752656 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-oauth-serving-cert\") pod \"console-759c5d468-wd5lm\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.752744 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.752683 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bh8p\" (UniqueName: \"kubernetes.io/projected/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-kube-api-access-5bh8p\") pod \"console-759c5d468-wd5lm\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.752744 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.752716 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-console-config\") pod \"console-759c5d468-wd5lm\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " pod="openshift-console/console-759c5d468-wd5lm" Apr 17 
16:32:36.753953 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.753661 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-console-oauth-config\") pod \"console-759c5d468-wd5lm\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.753953 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.753667 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-service-ca\") pod \"console-759c5d468-wd5lm\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.753953 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.753789 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-console-config\") pod \"console-759c5d468-wd5lm\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.754163 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.754062 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-oauth-serving-cert\") pod \"console-759c5d468-wd5lm\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.754264 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.754240 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-trusted-ca-bundle\") pod \"console-759c5d468-wd5lm\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " 
pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.757912 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.757693 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-console-oauth-config\") pod \"console-759c5d468-wd5lm\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.758852 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.758831 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-console-serving-cert\") pod \"console-759c5d468-wd5lm\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.760843 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.760818 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w7t8\" (UniqueName: \"kubernetes.io/projected/21c90255-d7fb-4403-bf8c-e896082f1d3c-kube-api-access-2w7t8\") pod \"downloads-6bcc868b7-flqq6\" (UID: \"21c90255-d7fb-4403-bf8c-e896082f1d3c\") " pod="openshift-console/downloads-6bcc868b7-flqq6" Apr 17 16:32:36.761315 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.761298 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bh8p\" (UniqueName: \"kubernetes.io/projected/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-kube-api-access-5bh8p\") pod \"console-759c5d468-wd5lm\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.854499 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.854411 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f56f8820-e530-469d-92c4-2ff27422a302-tls-certificates\") pod 
\"prometheus-operator-admission-webhook-57cf98b594-cgbt4\" (UID: \"f56f8820-e530-469d-92c4-2ff27422a302\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgbt4" Apr 17 16:32:36.856940 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.856887 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f56f8820-e530-469d-92c4-2ff27422a302-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-cgbt4\" (UID: \"f56f8820-e530-469d-92c4-2ff27422a302\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgbt4" Apr 17 16:32:36.859642 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.859624 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-flqq6" Apr 17 16:32:36.865429 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.865404 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:36.952879 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.952777 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgbt4" Apr 17 16:32:36.989468 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:36.989439 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-flqq6"] Apr 17 16:32:36.993022 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:32:36.992986 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21c90255_d7fb_4403_bf8c_e896082f1d3c.slice/crio-7d93ae465e379d4052b75729d6444c6bbef3cb1dd6e7f660a6b20d86b21d466c WatchSource:0}: Error finding container 7d93ae465e379d4052b75729d6444c6bbef3cb1dd6e7f660a6b20d86b21d466c: Status 404 returned error can't find the container with id 7d93ae465e379d4052b75729d6444c6bbef3cb1dd6e7f660a6b20d86b21d466c Apr 17 16:32:37.004142 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:37.004100 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-759c5d468-wd5lm"] Apr 17 16:32:37.010552 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:32:37.010525 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbfca93e_1f1e_4866_9f9e_2417644d7ee4.slice/crio-79123e9866e6565f8dc377fc7ec0e391b0d6e2a79497914445f1cf34d26288d6 WatchSource:0}: Error finding container 79123e9866e6565f8dc377fc7ec0e391b0d6e2a79497914445f1cf34d26288d6: Status 404 returned error can't find the container with id 79123e9866e6565f8dc377fc7ec0e391b0d6e2a79497914445f1cf34d26288d6 Apr 17 16:32:37.076083 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:37.076056 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgbt4"] Apr 17 16:32:37.079428 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:32:37.079399 2548 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf56f8820_e530_469d_92c4_2ff27422a302.slice/crio-0ca75f2c3d353f09aaff70c7d40cb78728c464839f196113de3a01b148cdcd9b WatchSource:0}: Error finding container 0ca75f2c3d353f09aaff70c7d40cb78728c464839f196113de3a01b148cdcd9b: Status 404 returned error can't find the container with id 0ca75f2c3d353f09aaff70c7d40cb78728c464839f196113de3a01b148cdcd9b Apr 17 16:32:37.550432 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:37.550391 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgbt4" event={"ID":"f56f8820-e530-469d-92c4-2ff27422a302","Type":"ContainerStarted","Data":"0ca75f2c3d353f09aaff70c7d40cb78728c464839f196113de3a01b148cdcd9b"} Apr 17 16:32:37.551441 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:37.551418 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-flqq6" event={"ID":"21c90255-d7fb-4403-bf8c-e896082f1d3c","Type":"ContainerStarted","Data":"7d93ae465e379d4052b75729d6444c6bbef3cb1dd6e7f660a6b20d86b21d466c"} Apr 17 16:32:37.552417 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:37.552395 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-759c5d468-wd5lm" event={"ID":"fbfca93e-1f1e-4866-9f9e-2417644d7ee4","Type":"ContainerStarted","Data":"79123e9866e6565f8dc377fc7ec0e391b0d6e2a79497914445f1cf34d26288d6"} Apr 17 16:32:38.557175 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:38.556742 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgbt4" event={"ID":"f56f8820-e530-469d-92c4-2ff27422a302","Type":"ContainerStarted","Data":"9fec8ac7242960c14b4b405ad3cfef79ab33d68db1b81b944eead5f3bd000789"} Apr 17 16:32:38.557175 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:38.557072 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgbt4" Apr 17 16:32:38.563352 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:38.563330 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgbt4" Apr 17 16:32:38.572458 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:38.572413 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgbt4" podStartSLOduration=1.277517301 podStartE2EDuration="2.572402565s" podCreationTimestamp="2026-04-17 16:32:36 +0000 UTC" firstStartedPulling="2026-04-17 16:32:37.081289958 +0000 UTC m=+60.319805719" lastFinishedPulling="2026-04-17 16:32:38.376175225 +0000 UTC m=+61.614690983" observedRunningTime="2026-04-17 16:32:38.5716184 +0000 UTC m=+61.810134180" watchObservedRunningTime="2026-04-17 16:32:38.572402565 +0000 UTC m=+61.810918387" Apr 17 16:32:40.564315 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:40.564277 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-759c5d468-wd5lm" event={"ID":"fbfca93e-1f1e-4866-9f9e-2417644d7ee4","Type":"ContainerStarted","Data":"1deb50839f6ca060902a2363b5b1e2336e4cc48fb2b48a64e4eb6c56473d08d5"} Apr 17 16:32:40.582131 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:40.582064 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-759c5d468-wd5lm" podStartSLOduration=1.3979459410000001 podStartE2EDuration="4.582046985s" podCreationTimestamp="2026-04-17 16:32:36 +0000 UTC" firstStartedPulling="2026-04-17 16:32:37.013242212 +0000 UTC m=+60.251757973" lastFinishedPulling="2026-04-17 16:32:40.197343253 +0000 UTC m=+63.435859017" observedRunningTime="2026-04-17 16:32:40.581324136 +0000 UTC m=+63.819839916" watchObservedRunningTime="2026-04-17 16:32:40.582046985 +0000 UTC m=+63.820562767" Apr 17 16:32:41.999444 
ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:41.999404 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs\") pod \"network-metrics-daemon-zsg2s\" (UID: \"60cbc498-937e-4f93-95af-294c0a8e7beb\") " pod="openshift-multus/network-metrics-daemon-zsg2s" Apr 17 16:32:42.002225 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:42.002204 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 16:32:42.012796 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:42.012773 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60cbc498-937e-4f93-95af-294c0a8e7beb-metrics-certs\") pod \"network-metrics-daemon-zsg2s\" (UID: \"60cbc498-937e-4f93-95af-294c0a8e7beb\") " pod="openshift-multus/network-metrics-daemon-zsg2s" Apr 17 16:32:42.100228 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:42.100198 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7dbk\" (UniqueName: \"kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk\") pod \"network-check-target-29tlc\" (UID: \"855921ad-75be-4568-9884-d3f6c5e1a862\") " pod="openshift-network-diagnostics/network-check-target-29tlc" Apr 17 16:32:42.102827 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:42.102798 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 16:32:42.113019 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:42.112997 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 16:32:42.123629 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:42.123593 2548 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-n7dbk\" (UniqueName: \"kubernetes.io/projected/855921ad-75be-4568-9884-d3f6c5e1a862-kube-api-access-n7dbk\") pod \"network-check-target-29tlc\" (UID: \"855921ad-75be-4568-9884-d3f6c5e1a862\") " pod="openshift-network-diagnostics/network-check-target-29tlc" Apr 17 16:32:42.132541 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:42.132522 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6m66z\"" Apr 17 16:32:42.138272 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:42.138254 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fwf7s\"" Apr 17 16:32:42.140938 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:42.140919 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsg2s" Apr 17 16:32:42.146788 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:42.146764 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-29tlc" Apr 17 16:32:42.284489 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:42.284457 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-29tlc"] Apr 17 16:32:42.286375 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:32:42.286347 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod855921ad_75be_4568_9884_d3f6c5e1a862.slice/crio-6d19daf4c6150b1add32dc8e89674a216f7335025018de8c4d2e9c0dc5a98dab WatchSource:0}: Error finding container 6d19daf4c6150b1add32dc8e89674a216f7335025018de8c4d2e9c0dc5a98dab: Status 404 returned error can't find the container with id 6d19daf4c6150b1add32dc8e89674a216f7335025018de8c4d2e9c0dc5a98dab Apr 17 16:32:42.299200 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:42.299177 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zsg2s"] Apr 17 16:32:42.302434 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:32:42.302408 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60cbc498_937e_4f93_95af_294c0a8e7beb.slice/crio-f6f91fa22516d49d13a78655a6668d8262d93b59fdadb7bf55926c91d5442b6a WatchSource:0}: Error finding container f6f91fa22516d49d13a78655a6668d8262d93b59fdadb7bf55926c91d5442b6a: Status 404 returned error can't find the container with id f6f91fa22516d49d13a78655a6668d8262d93b59fdadb7bf55926c91d5442b6a Apr 17 16:32:42.494290 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:42.494257 2548 patch_prober.go:28] interesting pod/image-registry-69d75d96b8-v2j79 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please 
see /debug/health"}]} Apr 17 16:32:42.494455 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:42.494310 2548 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-69d75d96b8-v2j79" podUID="7f757ddb-4ea6-4bda-8b90-920ac807f2eb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:32:42.571188 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:42.571149 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-29tlc" event={"ID":"855921ad-75be-4568-9884-d3f6c5e1a862","Type":"ContainerStarted","Data":"6d19daf4c6150b1add32dc8e89674a216f7335025018de8c4d2e9c0dc5a98dab"} Apr 17 16:32:42.572399 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:42.572370 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zsg2s" event={"ID":"60cbc498-937e-4f93-95af-294c0a8e7beb","Type":"ContainerStarted","Data":"f6f91fa22516d49d13a78655a6668d8262d93b59fdadb7bf55926c91d5442b6a"} Apr 17 16:32:44.486201 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:44.486169 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-69d75d96b8-v2j79" Apr 17 16:32:44.581276 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:44.581230 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zsg2s" event={"ID":"60cbc498-937e-4f93-95af-294c0a8e7beb","Type":"ContainerStarted","Data":"a696e285cbee3276098266855d1c68a01f68e60458ace9fc0cef55127f3bf376"} Apr 17 16:32:44.581443 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:44.581280 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zsg2s" event={"ID":"60cbc498-937e-4f93-95af-294c0a8e7beb","Type":"ContainerStarted","Data":"95bfc15fe5488e32d55512e4f28b20ff5af27462e31bfb01790e13954278083d"} Apr 17 16:32:46.407209 ip-10-0-141-239 
kubenswrapper[2548]: I0417 16:32:46.405619 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-zsg2s" podStartSLOduration=68.04848004 podStartE2EDuration="1m9.405600365s" podCreationTimestamp="2026-04-17 16:31:37 +0000 UTC" firstStartedPulling="2026-04-17 16:32:42.304567472 +0000 UTC m=+65.543083230" lastFinishedPulling="2026-04-17 16:32:43.661687792 +0000 UTC m=+66.900203555" observedRunningTime="2026-04-17 16:32:44.599223755 +0000 UTC m=+67.837739533" watchObservedRunningTime="2026-04-17 16:32:46.405600365 +0000 UTC m=+69.644116146" Apr 17 16:32:46.407209 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.406845 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-dnmj9"] Apr 17 16:32:46.410445 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.410415 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.412640 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.412616 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-pjgf8\"" Apr 17 16:32:46.412772 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.412670 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 16:32:46.413076 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.413048 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 16:32:46.413186 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.413161 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 16:32:46.413186 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.413181 2548 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 16:32:46.413458 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.413438 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 16:32:46.413528 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.413480 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 16:32:46.540602 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.540565 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8256c616-711b-4108-b911-6d292aed26c2-metrics-client-ca\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.540602 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.540613 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8256c616-711b-4108-b911-6d292aed26c2-node-exporter-wtmp\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.540819 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.540680 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8256c616-711b-4108-b911-6d292aed26c2-sys\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.540819 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.540753 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8256c616-711b-4108-b911-6d292aed26c2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.540819 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.540788 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8256c616-711b-4108-b911-6d292aed26c2-node-exporter-accelerators-collector-config\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.540994 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.540827 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4rwm\" (UniqueName: \"kubernetes.io/projected/8256c616-711b-4108-b911-6d292aed26c2-kube-api-access-w4rwm\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.540994 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.540914 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8256c616-711b-4108-b911-6d292aed26c2-node-exporter-textfile\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.540994 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.540952 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8256c616-711b-4108-b911-6d292aed26c2-node-exporter-tls\") pod \"node-exporter-dnmj9\" (UID: 
\"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.541120 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.540998 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8256c616-711b-4108-b911-6d292aed26c2-root\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.589025 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.588981 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-29tlc" event={"ID":"855921ad-75be-4568-9884-d3f6c5e1a862","Type":"ContainerStarted","Data":"6cde80e2be4ba0dfa21af14719169ca85aeae1dd8cf1a0bb690360b74cffdd38"} Apr 17 16:32:46.589194 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.589129 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-29tlc" Apr 17 16:32:46.605160 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.605103 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-29tlc" podStartSLOduration=66.31650901 podStartE2EDuration="1m9.605083085s" podCreationTimestamp="2026-04-17 16:31:37 +0000 UTC" firstStartedPulling="2026-04-17 16:32:42.28847715 +0000 UTC m=+65.526992907" lastFinishedPulling="2026-04-17 16:32:45.577051221 +0000 UTC m=+68.815566982" observedRunningTime="2026-04-17 16:32:46.60420852 +0000 UTC m=+69.842724299" watchObservedRunningTime="2026-04-17 16:32:46.605083085 +0000 UTC m=+69.843598867" Apr 17 16:32:46.642039 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.642002 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/8256c616-711b-4108-b911-6d292aed26c2-node-exporter-textfile\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.642220 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.642063 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8256c616-711b-4108-b911-6d292aed26c2-node-exporter-tls\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.642220 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.642110 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8256c616-711b-4108-b911-6d292aed26c2-root\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.642220 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.642142 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8256c616-711b-4108-b911-6d292aed26c2-metrics-client-ca\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.642220 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.642170 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8256c616-711b-4108-b911-6d292aed26c2-node-exporter-wtmp\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.642220 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.642197 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/8256c616-711b-4108-b911-6d292aed26c2-sys\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.642453 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.642222 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8256c616-711b-4108-b911-6d292aed26c2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.642453 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:46.642240 2548 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 16:32:46.642453 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.642254 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8256c616-711b-4108-b911-6d292aed26c2-node-exporter-accelerators-collector-config\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.642453 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.642282 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4rwm\" (UniqueName: \"kubernetes.io/projected/8256c616-711b-4108-b911-6d292aed26c2-kube-api-access-w4rwm\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.642453 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:32:46.642318 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8256c616-711b-4108-b911-6d292aed26c2-node-exporter-tls podName:8256c616-711b-4108-b911-6d292aed26c2 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:32:47.14229641 +0000 UTC m=+70.380812173 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/8256c616-711b-4108-b911-6d292aed26c2-node-exporter-tls") pod "node-exporter-dnmj9" (UID: "8256c616-711b-4108-b911-6d292aed26c2") : secret "node-exporter-tls" not found Apr 17 16:32:46.642453 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.642394 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8256c616-711b-4108-b911-6d292aed26c2-root\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.642453 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.642391 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8256c616-711b-4108-b911-6d292aed26c2-node-exporter-textfile\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.642453 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.642448 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8256c616-711b-4108-b911-6d292aed26c2-sys\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.642821 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.642642 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8256c616-711b-4108-b911-6d292aed26c2-node-exporter-wtmp\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.643068 ip-10-0-141-239 kubenswrapper[2548]: I0417 
16:32:46.643016 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8256c616-711b-4108-b911-6d292aed26c2-metrics-client-ca\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.643929 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.643881 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8256c616-711b-4108-b911-6d292aed26c2-node-exporter-accelerators-collector-config\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.645171 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.645148 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8256c616-711b-4108-b911-6d292aed26c2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.652344 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.652324 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4rwm\" (UniqueName: \"kubernetes.io/projected/8256c616-711b-4108-b911-6d292aed26c2-kube-api-access-w4rwm\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:46.865863 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.865817 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:46.866053 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.865932 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:46.872481 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:46.872457 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:47.146569 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:47.146476 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8256c616-711b-4108-b911-6d292aed26c2-node-exporter-tls\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:47.149124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:47.149096 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8256c616-711b-4108-b911-6d292aed26c2-node-exporter-tls\") pod \"node-exporter-dnmj9\" (UID: \"8256c616-711b-4108-b911-6d292aed26c2\") " pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:47.320804 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:47.320773 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dnmj9" Apr 17 16:32:47.596747 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:47.596699 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:32:50.694607 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.694575 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5fbc7d488c-8tbzf"] Apr 17 16:32:50.699119 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.699096 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:50.701658 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.701286 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 16:32:50.701658 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.701514 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 17 16:32:50.702546 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.702184 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 17 16:32:50.702546 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.702189 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 17 16:32:50.702546 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.702397 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-d9evuptvssrq8\"" Apr 17 16:32:50.702546 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.702450 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-7pl6g\"" Apr 17 16:32:50.706853 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.706828 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5fbc7d488c-8tbzf"] Apr 17 16:32:50.780220 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.780176 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/31746d46-0cf2-488b-9f68-e292c43470a6-audit-log\") pod \"metrics-server-5fbc7d488c-8tbzf\" (UID: \"31746d46-0cf2-488b-9f68-e292c43470a6\") " 
pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:50.780398 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.780222 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/31746d46-0cf2-488b-9f68-e292c43470a6-secret-metrics-server-tls\") pod \"metrics-server-5fbc7d488c-8tbzf\" (UID: \"31746d46-0cf2-488b-9f68-e292c43470a6\") " pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:50.780398 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.780299 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/31746d46-0cf2-488b-9f68-e292c43470a6-metrics-server-audit-profiles\") pod \"metrics-server-5fbc7d488c-8tbzf\" (UID: \"31746d46-0cf2-488b-9f68-e292c43470a6\") " pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:50.780398 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.780361 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/31746d46-0cf2-488b-9f68-e292c43470a6-secret-metrics-server-client-certs\") pod \"metrics-server-5fbc7d488c-8tbzf\" (UID: \"31746d46-0cf2-488b-9f68-e292c43470a6\") " pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:50.780554 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.780453 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31746d46-0cf2-488b-9f68-e292c43470a6-client-ca-bundle\") pod \"metrics-server-5fbc7d488c-8tbzf\" (UID: \"31746d46-0cf2-488b-9f68-e292c43470a6\") " pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:50.780554 ip-10-0-141-239 kubenswrapper[2548]: I0417 
16:32:50.780475 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c88j2\" (UniqueName: \"kubernetes.io/projected/31746d46-0cf2-488b-9f68-e292c43470a6-kube-api-access-c88j2\") pod \"metrics-server-5fbc7d488c-8tbzf\" (UID: \"31746d46-0cf2-488b-9f68-e292c43470a6\") " pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:50.780554 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.780497 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31746d46-0cf2-488b-9f68-e292c43470a6-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fbc7d488c-8tbzf\" (UID: \"31746d46-0cf2-488b-9f68-e292c43470a6\") " pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:50.881545 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.881507 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31746d46-0cf2-488b-9f68-e292c43470a6-client-ca-bundle\") pod \"metrics-server-5fbc7d488c-8tbzf\" (UID: \"31746d46-0cf2-488b-9f68-e292c43470a6\") " pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:50.881726 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.881557 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c88j2\" (UniqueName: \"kubernetes.io/projected/31746d46-0cf2-488b-9f68-e292c43470a6-kube-api-access-c88j2\") pod \"metrics-server-5fbc7d488c-8tbzf\" (UID: \"31746d46-0cf2-488b-9f68-e292c43470a6\") " pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:50.881726 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.881588 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/31746d46-0cf2-488b-9f68-e292c43470a6-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fbc7d488c-8tbzf\" (UID: \"31746d46-0cf2-488b-9f68-e292c43470a6\") " pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:50.881726 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.881628 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/31746d46-0cf2-488b-9f68-e292c43470a6-audit-log\") pod \"metrics-server-5fbc7d488c-8tbzf\" (UID: \"31746d46-0cf2-488b-9f68-e292c43470a6\") " pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:50.881726 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.881656 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/31746d46-0cf2-488b-9f68-e292c43470a6-secret-metrics-server-tls\") pod \"metrics-server-5fbc7d488c-8tbzf\" (UID: \"31746d46-0cf2-488b-9f68-e292c43470a6\") " pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:50.881726 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.881687 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/31746d46-0cf2-488b-9f68-e292c43470a6-metrics-server-audit-profiles\") pod \"metrics-server-5fbc7d488c-8tbzf\" (UID: \"31746d46-0cf2-488b-9f68-e292c43470a6\") " pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:50.881726 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.881721 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/31746d46-0cf2-488b-9f68-e292c43470a6-secret-metrics-server-client-certs\") pod \"metrics-server-5fbc7d488c-8tbzf\" (UID: \"31746d46-0cf2-488b-9f68-e292c43470a6\") " 
pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:50.882808 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.882755 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31746d46-0cf2-488b-9f68-e292c43470a6-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fbc7d488c-8tbzf\" (UID: \"31746d46-0cf2-488b-9f68-e292c43470a6\") " pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:50.882953 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.882286 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/31746d46-0cf2-488b-9f68-e292c43470a6-audit-log\") pod \"metrics-server-5fbc7d488c-8tbzf\" (UID: \"31746d46-0cf2-488b-9f68-e292c43470a6\") " pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:50.883042 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.883027 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/31746d46-0cf2-488b-9f68-e292c43470a6-metrics-server-audit-profiles\") pod \"metrics-server-5fbc7d488c-8tbzf\" (UID: \"31746d46-0cf2-488b-9f68-e292c43470a6\") " pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:50.884921 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.884882 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31746d46-0cf2-488b-9f68-e292c43470a6-client-ca-bundle\") pod \"metrics-server-5fbc7d488c-8tbzf\" (UID: \"31746d46-0cf2-488b-9f68-e292c43470a6\") " pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:50.888365 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.888318 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/31746d46-0cf2-488b-9f68-e292c43470a6-secret-metrics-server-tls\") pod \"metrics-server-5fbc7d488c-8tbzf\" (UID: \"31746d46-0cf2-488b-9f68-e292c43470a6\") " pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:50.890024 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.889997 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c88j2\" (UniqueName: \"kubernetes.io/projected/31746d46-0cf2-488b-9f68-e292c43470a6-kube-api-access-c88j2\") pod \"metrics-server-5fbc7d488c-8tbzf\" (UID: \"31746d46-0cf2-488b-9f68-e292c43470a6\") " pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:50.890442 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:50.890419 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/31746d46-0cf2-488b-9f68-e292c43470a6-secret-metrics-server-client-certs\") pod \"metrics-server-5fbc7d488c-8tbzf\" (UID: \"31746d46-0cf2-488b-9f68-e292c43470a6\") " pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:51.011518 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:51.011440 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:32:51.172242 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:51.172210 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-nxvml"] Apr 17 16:32:51.200984 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:51.200952 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-nxvml"] Apr 17 16:32:51.201138 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:51.201094 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nxvml" Apr 17 16:32:51.203549 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:51.203417 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 16:32:51.203549 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:51.203509 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-tqnhk\"" Apr 17 16:32:51.285283 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:51.285245 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bd0ee333-af57-4d69-90f8-950628bf752e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-nxvml\" (UID: \"bd0ee333-af57-4d69-90f8-950628bf752e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nxvml" Apr 17 16:32:51.386028 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:51.385990 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bd0ee333-af57-4d69-90f8-950628bf752e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-nxvml\" (UID: \"bd0ee333-af57-4d69-90f8-950628bf752e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nxvml" Apr 17 16:32:51.388711 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:51.388686 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bd0ee333-af57-4d69-90f8-950628bf752e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-nxvml\" (UID: \"bd0ee333-af57-4d69-90f8-950628bf752e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nxvml" Apr 17 16:32:51.512532 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:51.512492 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nxvml" Apr 17 16:32:53.642301 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:32:53.642273 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8256c616_711b_4108_b911_6d292aed26c2.slice/crio-aba11588ececb852334abb54c3dc45590a5c03ac692e2eb5df3480bb38b2bb41 WatchSource:0}: Error finding container aba11588ececb852334abb54c3dc45590a5c03ac692e2eb5df3480bb38b2bb41: Status 404 returned error can't find the container with id aba11588ececb852334abb54c3dc45590a5c03ac692e2eb5df3480bb38b2bb41 Apr 17 16:32:53.766679 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:53.766505 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-nxvml"] Apr 17 16:32:53.785334 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:53.785304 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5fbc7d488c-8tbzf"] Apr 17 16:32:53.868500 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:32:53.868471 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd0ee333_af57_4d69_90f8_950628bf752e.slice/crio-e5decefa949b321f797362977877e909c8210ffe25ea38377bc4125d6304a12c WatchSource:0}: Error finding container e5decefa949b321f797362977877e909c8210ffe25ea38377bc4125d6304a12c: Status 404 returned error can't find the container with id e5decefa949b321f797362977877e909c8210ffe25ea38377bc4125d6304a12c Apr 17 16:32:53.869361 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:32:53.869321 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31746d46_0cf2_488b_9f68_e292c43470a6.slice/crio-fb2399ab45757fa947a3a21699ce46e6fdd2b0f30d9dcf0c2ad12fe11cd3dbec WatchSource:0}: Error finding container 
fb2399ab45757fa947a3a21699ce46e6fdd2b0f30d9dcf0c2ad12fe11cd3dbec: Status 404 returned error can't find the container with id fb2399ab45757fa947a3a21699ce46e6fdd2b0f30d9dcf0c2ad12fe11cd3dbec Apr 17 16:32:54.620634 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:54.617852 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-flqq6" event={"ID":"21c90255-d7fb-4403-bf8c-e896082f1d3c","Type":"ContainerStarted","Data":"36b4dc0f1fe6a93cb2be473523c2c836c2581f8a016179783f2f1a3673ed5287"} Apr 17 16:32:54.621109 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:54.620844 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-flqq6" Apr 17 16:32:54.622492 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:54.622442 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" event={"ID":"31746d46-0cf2-488b-9f68-e292c43470a6","Type":"ContainerStarted","Data":"fb2399ab45757fa947a3a21699ce46e6fdd2b0f30d9dcf0c2ad12fe11cd3dbec"} Apr 17 16:32:54.625662 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:54.625575 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dnmj9" event={"ID":"8256c616-711b-4108-b911-6d292aed26c2","Type":"ContainerStarted","Data":"aba11588ececb852334abb54c3dc45590a5c03ac692e2eb5df3480bb38b2bb41"} Apr 17 16:32:54.627002 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:54.626963 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nxvml" event={"ID":"bd0ee333-af57-4d69-90f8-950628bf752e","Type":"ContainerStarted","Data":"e5decefa949b321f797362977877e909c8210ffe25ea38377bc4125d6304a12c"} Apr 17 16:32:54.633076 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:54.633038 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-flqq6" Apr 17 
16:32:54.636556 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:54.636514 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-flqq6" podStartSLOduration=1.699262956 podStartE2EDuration="18.636501237s" podCreationTimestamp="2026-04-17 16:32:36 +0000 UTC" firstStartedPulling="2026-04-17 16:32:36.995273047 +0000 UTC m=+60.233788808" lastFinishedPulling="2026-04-17 16:32:53.932511313 +0000 UTC m=+77.171027089" observedRunningTime="2026-04-17 16:32:54.634777243 +0000 UTC m=+77.873293147" watchObservedRunningTime="2026-04-17 16:32:54.636501237 +0000 UTC m=+77.875017016" Apr 17 16:32:55.632937 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:55.632873 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dnmj9" event={"ID":"8256c616-711b-4108-b911-6d292aed26c2","Type":"ContainerStarted","Data":"b7bdf2713658afd59c0176d773065a3b1891bc483cbf4c07a7447a036204d5e4"} Apr 17 16:32:56.637683 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:56.637652 2548 generic.go:358] "Generic (PLEG): container finished" podID="8256c616-711b-4108-b911-6d292aed26c2" containerID="b7bdf2713658afd59c0176d773065a3b1891bc483cbf4c07a7447a036204d5e4" exitCode=0 Apr 17 16:32:56.638138 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:56.637745 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dnmj9" event={"ID":"8256c616-711b-4108-b911-6d292aed26c2","Type":"ContainerDied","Data":"b7bdf2713658afd59c0176d773065a3b1891bc483cbf4c07a7447a036204d5e4"} Apr 17 16:32:57.642360 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:57.642317 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nxvml" event={"ID":"bd0ee333-af57-4d69-90f8-950628bf752e","Type":"ContainerStarted","Data":"c1ccd4cf5f75eb8d84f1d8a5cd51c1bd80ece7997d22e4f01da52ccdb39bdbe5"} Apr 17 16:32:57.642815 ip-10-0-141-239 kubenswrapper[2548]: I0417 
16:32:57.642572 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nxvml" Apr 17 16:32:57.644201 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:57.644168 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" event={"ID":"31746d46-0cf2-488b-9f68-e292c43470a6","Type":"ContainerStarted","Data":"4efb577cf148a4e7f774ac355cc9272ee43412f14b26c9625a83ed50dea5bedc"} Apr 17 16:32:57.646523 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:57.646486 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dnmj9" event={"ID":"8256c616-711b-4108-b911-6d292aed26c2","Type":"ContainerStarted","Data":"cf61722fbf308102355c022098b9d3b986869054843251901a6c22c1599e8374"} Apr 17 16:32:57.646523 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:57.646519 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dnmj9" event={"ID":"8256c616-711b-4108-b911-6d292aed26c2","Type":"ContainerStarted","Data":"390fd7eada9d0b7739b09b664e28c5c7a20bae61fd9a89a567b0f209c2bd6be7"} Apr 17 16:32:57.647886 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:57.647866 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nxvml" Apr 17 16:32:57.658132 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:57.658090 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nxvml" podStartSLOduration=3.771627455 podStartE2EDuration="6.658075643s" podCreationTimestamp="2026-04-17 16:32:51 +0000 UTC" firstStartedPulling="2026-04-17 16:32:53.887849942 +0000 UTC m=+77.126365702" lastFinishedPulling="2026-04-17 16:32:56.774298133 +0000 UTC m=+80.012813890" observedRunningTime="2026-04-17 16:32:57.657036362 +0000 UTC m=+80.895552142" 
watchObservedRunningTime="2026-04-17 16:32:57.658075643 +0000 UTC m=+80.896591425" Apr 17 16:32:57.677095 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:57.677053 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" podStartSLOduration=4.787888981 podStartE2EDuration="7.677042139s" podCreationTimestamp="2026-04-17 16:32:50 +0000 UTC" firstStartedPulling="2026-04-17 16:32:53.887778767 +0000 UTC m=+77.126294527" lastFinishedPulling="2026-04-17 16:32:56.776931921 +0000 UTC m=+80.015447685" observedRunningTime="2026-04-17 16:32:57.675490808 +0000 UTC m=+80.914006586" watchObservedRunningTime="2026-04-17 16:32:57.677042139 +0000 UTC m=+80.915557917" Apr 17 16:32:57.693270 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:57.693226 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-dnmj9" podStartSLOduration=10.14592708 podStartE2EDuration="11.693212542s" podCreationTimestamp="2026-04-17 16:32:46 +0000 UTC" firstStartedPulling="2026-04-17 16:32:53.644343807 +0000 UTC m=+76.882859576" lastFinishedPulling="2026-04-17 16:32:55.191629267 +0000 UTC m=+78.430145038" observedRunningTime="2026-04-17 16:32:57.69219067 +0000 UTC m=+80.930706654" watchObservedRunningTime="2026-04-17 16:32:57.693212542 +0000 UTC m=+80.931728322" Apr 17 16:32:58.671039 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:32:58.671005 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69d75d96b8-v2j79"] Apr 17 16:33:02.442512 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:02.442473 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-759c5d468-wd5lm"] Apr 17 16:33:11.012110 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:11.012075 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 
16:33:11.012110 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:11.012119 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf" Apr 17 16:33:17.595327 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:17.595290 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-29tlc" Apr 17 16:33:23.704318 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:23.704250 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-69d75d96b8-v2j79" podUID="7f757ddb-4ea6-4bda-8b90-920ac807f2eb" containerName="registry" containerID="cri-o://7f18cf81ce9a8a508cc00e44dda7ba7bf21cd1d64bae6764209e8d795aca6675" gracePeriod=30 Apr 17 16:33:23.937424 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:23.937403 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69d75d96b8-v2j79" Apr 17 16:33:24.045421 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.045400 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-registry-tls\") pod \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " Apr 17 16:33:24.045574 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.045452 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-installation-pull-secrets\") pod \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " Apr 17 16:33:24.045574 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.045475 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-registry-certificates\") pod \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " Apr 17 16:33:24.045574 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.045493 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-ca-trust-extracted\") pod \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " Apr 17 16:33:24.045724 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.045625 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgzk9\" (UniqueName: \"kubernetes.io/projected/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-kube-api-access-fgzk9\") pod \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " Apr 17 16:33:24.045724 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.045693 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-image-registry-private-configuration\") pod \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " Apr 17 16:33:24.045827 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.045727 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-bound-sa-token\") pod \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " Apr 17 16:33:24.045827 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.045768 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-trusted-ca\") pod \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\" (UID: \"7f757ddb-4ea6-4bda-8b90-920ac807f2eb\") " Apr 17 16:33:24.046016 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.045986 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7f757ddb-4ea6-4bda-8b90-920ac807f2eb" (UID: "7f757ddb-4ea6-4bda-8b90-920ac807f2eb"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:33:24.046395 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.046332 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7f757ddb-4ea6-4bda-8b90-920ac807f2eb" (UID: "7f757ddb-4ea6-4bda-8b90-920ac807f2eb"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:33:24.048006 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.047978 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7f757ddb-4ea6-4bda-8b90-920ac807f2eb" (UID: "7f757ddb-4ea6-4bda-8b90-920ac807f2eb"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:33:24.048182 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.048157 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7f757ddb-4ea6-4bda-8b90-920ac807f2eb" (UID: "7f757ddb-4ea6-4bda-8b90-920ac807f2eb"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:33:24.048460 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.048425 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "7f757ddb-4ea6-4bda-8b90-920ac807f2eb" (UID: "7f757ddb-4ea6-4bda-8b90-920ac807f2eb"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:33:24.048568 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.048501 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7f757ddb-4ea6-4bda-8b90-920ac807f2eb" (UID: "7f757ddb-4ea6-4bda-8b90-920ac807f2eb"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:33:24.048665 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.048642 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-kube-api-access-fgzk9" (OuterVolumeSpecName: "kube-api-access-fgzk9") pod "7f757ddb-4ea6-4bda-8b90-920ac807f2eb" (UID: "7f757ddb-4ea6-4bda-8b90-920ac807f2eb"). InnerVolumeSpecName "kube-api-access-fgzk9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:33:24.054958 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.054936 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7f757ddb-4ea6-4bda-8b90-920ac807f2eb" (UID: "7f757ddb-4ea6-4bda-8b90-920ac807f2eb"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:33:24.146989 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.146957 2548 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-image-registry-private-configuration\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:33:24.146989 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.146984 2548 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-bound-sa-token\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:33:24.147170 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.147000 2548 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-trusted-ca\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:33:24.147170 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.147012 2548 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-registry-tls\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:33:24.147170 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.147024 2548 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-installation-pull-secrets\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:33:24.147170 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.147038 2548 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-registry-certificates\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 
16:33:24.147170 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.147051 2548 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-ca-trust-extracted\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:33:24.147170 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.147063 2548 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fgzk9\" (UniqueName: \"kubernetes.io/projected/7f757ddb-4ea6-4bda-8b90-920ac807f2eb-kube-api-access-fgzk9\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:33:24.718142 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.718104 2548 generic.go:358] "Generic (PLEG): container finished" podID="7f757ddb-4ea6-4bda-8b90-920ac807f2eb" containerID="7f18cf81ce9a8a508cc00e44dda7ba7bf21cd1d64bae6764209e8d795aca6675" exitCode=0 Apr 17 16:33:24.718551 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.718160 2548 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-69d75d96b8-v2j79" Apr 17 16:33:24.718551 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.718168 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69d75d96b8-v2j79" event={"ID":"7f757ddb-4ea6-4bda-8b90-920ac807f2eb","Type":"ContainerDied","Data":"7f18cf81ce9a8a508cc00e44dda7ba7bf21cd1d64bae6764209e8d795aca6675"} Apr 17 16:33:24.718551 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.718202 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69d75d96b8-v2j79" event={"ID":"7f757ddb-4ea6-4bda-8b90-920ac807f2eb","Type":"ContainerDied","Data":"20d159d1996a0e1614ccfa8b0bf3e0e6404e32e0ddadf06adecbaca0203dc4d7"} Apr 17 16:33:24.718551 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.718219 2548 scope.go:117] "RemoveContainer" containerID="7f18cf81ce9a8a508cc00e44dda7ba7bf21cd1d64bae6764209e8d795aca6675" Apr 17 16:33:24.726640 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.726617 2548 scope.go:117] "RemoveContainer" containerID="7f18cf81ce9a8a508cc00e44dda7ba7bf21cd1d64bae6764209e8d795aca6675" Apr 17 16:33:24.726923 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:33:24.726885 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f18cf81ce9a8a508cc00e44dda7ba7bf21cd1d64bae6764209e8d795aca6675\": container with ID starting with 7f18cf81ce9a8a508cc00e44dda7ba7bf21cd1d64bae6764209e8d795aca6675 not found: ID does not exist" containerID="7f18cf81ce9a8a508cc00e44dda7ba7bf21cd1d64bae6764209e8d795aca6675" Apr 17 16:33:24.726997 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.726931 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f18cf81ce9a8a508cc00e44dda7ba7bf21cd1d64bae6764209e8d795aca6675"} err="failed to get container status 
\"7f18cf81ce9a8a508cc00e44dda7ba7bf21cd1d64bae6764209e8d795aca6675\": rpc error: code = NotFound desc = could not find container \"7f18cf81ce9a8a508cc00e44dda7ba7bf21cd1d64bae6764209e8d795aca6675\": container with ID starting with 7f18cf81ce9a8a508cc00e44dda7ba7bf21cd1d64bae6764209e8d795aca6675 not found: ID does not exist" Apr 17 16:33:24.738495 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.738471 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69d75d96b8-v2j79"] Apr 17 16:33:24.743389 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:24.743361 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-69d75d96b8-v2j79"] Apr 17 16:33:25.325367 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:25.325317 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f757ddb-4ea6-4bda-8b90-920ac807f2eb" path="/var/lib/kubelet/pods/7f757ddb-4ea6-4bda-8b90-920ac807f2eb/volumes" Apr 17 16:33:27.469714 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.469669 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-759c5d468-wd5lm" podUID="fbfca93e-1f1e-4866-9f9e-2417644d7ee4" containerName="console" containerID="cri-o://1deb50839f6ca060902a2363b5b1e2336e4cc48fb2b48a64e4eb6c56473d08d5" gracePeriod=15 Apr 17 16:33:27.592493 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.592460 2548 patch_prober.go:28] interesting pod/console-759c5d468-wd5lm container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.133.0.12:8443/health\": dial tcp 10.133.0.12:8443: connect: connection refused" start-of-body= Apr 17 16:33:27.592624 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.592530 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-759c5d468-wd5lm" podUID="fbfca93e-1f1e-4866-9f9e-2417644d7ee4" containerName="console" probeResult="failure" 
output="Get \"https://10.133.0.12:8443/health\": dial tcp 10.133.0.12:8443: connect: connection refused" Apr 17 16:33:27.727747 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.727687 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-759c5d468-wd5lm_fbfca93e-1f1e-4866-9f9e-2417644d7ee4/console/0.log" Apr 17 16:33:27.727747 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.727733 2548 generic.go:358] "Generic (PLEG): container finished" podID="fbfca93e-1f1e-4866-9f9e-2417644d7ee4" containerID="1deb50839f6ca060902a2363b5b1e2336e4cc48fb2b48a64e4eb6c56473d08d5" exitCode=2 Apr 17 16:33:27.727920 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.727799 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-759c5d468-wd5lm" event={"ID":"fbfca93e-1f1e-4866-9f9e-2417644d7ee4","Type":"ContainerDied","Data":"1deb50839f6ca060902a2363b5b1e2336e4cc48fb2b48a64e4eb6c56473d08d5"} Apr 17 16:33:27.746383 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.746364 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-759c5d468-wd5lm_fbfca93e-1f1e-4866-9f9e-2417644d7ee4/console/0.log" Apr 17 16:33:27.746511 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.746431 2548 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-759c5d468-wd5lm" Apr 17 16:33:27.775160 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.775139 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-console-config\") pod \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " Apr 17 16:33:27.775290 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.775173 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-service-ca\") pod \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " Apr 17 16:33:27.775290 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.775229 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-trusted-ca-bundle\") pod \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " Apr 17 16:33:27.775290 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.775265 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bh8p\" (UniqueName: \"kubernetes.io/projected/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-kube-api-access-5bh8p\") pod \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " Apr 17 16:33:27.775441 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.775299 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-oauth-serving-cert\") pod \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " Apr 17 16:33:27.775441 ip-10-0-141-239 
kubenswrapper[2548]: I0417 16:33:27.775325 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-console-serving-cert\") pod \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " Apr 17 16:33:27.775441 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.775361 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-console-oauth-config\") pod \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\" (UID: \"fbfca93e-1f1e-4866-9f9e-2417644d7ee4\") " Apr 17 16:33:27.775591 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.775563 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-console-config" (OuterVolumeSpecName: "console-config") pod "fbfca93e-1f1e-4866-9f9e-2417644d7ee4" (UID: "fbfca93e-1f1e-4866-9f9e-2417644d7ee4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:33:27.775835 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.775785 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "fbfca93e-1f1e-4866-9f9e-2417644d7ee4" (UID: "fbfca93e-1f1e-4866-9f9e-2417644d7ee4"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:33:27.776047 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.775952 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "fbfca93e-1f1e-4866-9f9e-2417644d7ee4" (UID: "fbfca93e-1f1e-4866-9f9e-2417644d7ee4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:33:27.776661 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.776634 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-service-ca" (OuterVolumeSpecName: "service-ca") pod "fbfca93e-1f1e-4866-9f9e-2417644d7ee4" (UID: "fbfca93e-1f1e-4866-9f9e-2417644d7ee4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:33:27.778407 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.778383 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "fbfca93e-1f1e-4866-9f9e-2417644d7ee4" (UID: "fbfca93e-1f1e-4866-9f9e-2417644d7ee4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:33:27.778556 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.778533 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "fbfca93e-1f1e-4866-9f9e-2417644d7ee4" (UID: "fbfca93e-1f1e-4866-9f9e-2417644d7ee4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:33:27.778630 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.778582 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-kube-api-access-5bh8p" (OuterVolumeSpecName: "kube-api-access-5bh8p") pod "fbfca93e-1f1e-4866-9f9e-2417644d7ee4" (UID: "fbfca93e-1f1e-4866-9f9e-2417644d7ee4"). InnerVolumeSpecName "kube-api-access-5bh8p". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:33:27.876054 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.876030 2548 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-console-oauth-config\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:33:27.876054 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.876054 2548 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-console-config\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:33:27.876196 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.876064 2548 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-service-ca\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:33:27.876196 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.876072 2548 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-trusted-ca-bundle\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:33:27.876196 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.876083 2548 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5bh8p\" (UniqueName: \"kubernetes.io/projected/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-kube-api-access-5bh8p\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:33:27.876196 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.876092 2548 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-oauth-serving-cert\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:33:27.876196 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:27.876100 2548 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbfca93e-1f1e-4866-9f9e-2417644d7ee4-console-serving-cert\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:33:28.731578 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:28.731553 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-759c5d468-wd5lm_fbfca93e-1f1e-4866-9f9e-2417644d7ee4/console/0.log"
Apr 17 16:33:28.732026 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:28.731629 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-759c5d468-wd5lm" event={"ID":"fbfca93e-1f1e-4866-9f9e-2417644d7ee4","Type":"ContainerDied","Data":"79123e9866e6565f8dc377fc7ec0e391b0d6e2a79497914445f1cf34d26288d6"}
Apr 17 16:33:28.732026 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:28.731663 2548 scope.go:117] "RemoveContainer" containerID="1deb50839f6ca060902a2363b5b1e2336e4cc48fb2b48a64e4eb6c56473d08d5"
Apr 17 16:33:28.732026 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:28.731678 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-759c5d468-wd5lm"
Apr 17 16:33:28.751138 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:28.751117 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-759c5d468-wd5lm"]
Apr 17 16:33:28.759339 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:28.759313 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-759c5d468-wd5lm"]
Apr 17 16:33:29.327773 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:29.326260 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbfca93e-1f1e-4866-9f9e-2417644d7ee4" path="/var/lib/kubelet/pods/fbfca93e-1f1e-4866-9f9e-2417644d7ee4/volumes"
Apr 17 16:33:31.017931 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:31.017885 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf"
Apr 17 16:33:31.021698 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:33:31.021680 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5fbc7d488c-8tbzf"
Apr 17 16:34:10.730939 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.730910 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"]
Apr 17 16:34:10.731397 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.731159 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fbfca93e-1f1e-4866-9f9e-2417644d7ee4" containerName="console"
Apr 17 16:34:10.731397 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.731170 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbfca93e-1f1e-4866-9f9e-2417644d7ee4" containerName="console"
Apr 17 16:34:10.731397 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.731186 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f757ddb-4ea6-4bda-8b90-920ac807f2eb" containerName="registry"
Apr 17 16:34:10.731397 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.731191 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f757ddb-4ea6-4bda-8b90-920ac807f2eb" containerName="registry"
Apr 17 16:34:10.731397 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.731239 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="fbfca93e-1f1e-4866-9f9e-2417644d7ee4" containerName="console"
Apr 17 16:34:10.731397 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.731248 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f757ddb-4ea6-4bda-8b90-920ac807f2eb" containerName="registry"
Apr 17 16:34:10.734244 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.734228 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.739519 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.739497 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 17 16:34:10.739659 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.739582 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-jx52k\""
Apr 17 16:34:10.740005 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.739990 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 17 16:34:10.740092 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.740029 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 17 16:34:10.740092 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.740061 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 17 16:34:10.740192 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.740161 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 17 16:34:10.745096 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.745077 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 17 16:34:10.758047 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.758022 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"]
Apr 17 16:34:10.770675 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.770657 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/029fd86d-6da2-4fcd-9f62-3a1d068b6866-metrics-client-ca\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.770767 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.770700 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/029fd86d-6da2-4fcd-9f62-3a1d068b6866-telemeter-client-tls\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.770767 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.770721 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/029fd86d-6da2-4fcd-9f62-3a1d068b6866-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.770840 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.770783 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/029fd86d-6da2-4fcd-9f62-3a1d068b6866-federate-client-tls\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.770840 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.770805 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/029fd86d-6da2-4fcd-9f62-3a1d068b6866-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.770840 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.770836 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2wsr\" (UniqueName: \"kubernetes.io/projected/029fd86d-6da2-4fcd-9f62-3a1d068b6866-kube-api-access-k2wsr\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.770960 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.770852 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/029fd86d-6da2-4fcd-9f62-3a1d068b6866-serving-certs-ca-bundle\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.770960 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.770872 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/029fd86d-6da2-4fcd-9f62-3a1d068b6866-secret-telemeter-client\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.871701 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.871670 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/029fd86d-6da2-4fcd-9f62-3a1d068b6866-metrics-client-ca\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.871862 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.871713 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/029fd86d-6da2-4fcd-9f62-3a1d068b6866-telemeter-client-tls\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.871862 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.871734 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/029fd86d-6da2-4fcd-9f62-3a1d068b6866-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.871862 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.871757 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/029fd86d-6da2-4fcd-9f62-3a1d068b6866-federate-client-tls\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.871862 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.871774 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/029fd86d-6da2-4fcd-9f62-3a1d068b6866-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.871862 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.871797 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2wsr\" (UniqueName: \"kubernetes.io/projected/029fd86d-6da2-4fcd-9f62-3a1d068b6866-kube-api-access-k2wsr\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.871862 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.871812 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/029fd86d-6da2-4fcd-9f62-3a1d068b6866-serving-certs-ca-bundle\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.872220 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.871946 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/029fd86d-6da2-4fcd-9f62-3a1d068b6866-secret-telemeter-client\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.872536 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.872473 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/029fd86d-6da2-4fcd-9f62-3a1d068b6866-metrics-client-ca\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.872806 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.872753 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/029fd86d-6da2-4fcd-9f62-3a1d068b6866-serving-certs-ca-bundle\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.873544 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.873520 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/029fd86d-6da2-4fcd-9f62-3a1d068b6866-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.874446 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.874419 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/029fd86d-6da2-4fcd-9f62-3a1d068b6866-secret-telemeter-client\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.874553 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.874448 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/029fd86d-6da2-4fcd-9f62-3a1d068b6866-federate-client-tls\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.874553 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.874528 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/029fd86d-6da2-4fcd-9f62-3a1d068b6866-telemeter-client-tls\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.874553 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.874545 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/029fd86d-6da2-4fcd-9f62-3a1d068b6866-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:10.881833 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:10.881812 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2wsr\" (UniqueName: \"kubernetes.io/projected/029fd86d-6da2-4fcd-9f62-3a1d068b6866-kube-api-access-k2wsr\") pod \"telemeter-client-5f9d4b86d6-kxw45\" (UID: \"029fd86d-6da2-4fcd-9f62-3a1d068b6866\") " pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:11.043247 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:11.043224 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"
Apr 17 16:34:11.188192 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:11.188171 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45"]
Apr 17 16:34:11.190038 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:34:11.190011 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod029fd86d_6da2_4fcd_9f62_3a1d068b6866.slice/crio-444d2c6420980125ca423e248deec819ec8ad3464f83d6db45dd7a994896871a WatchSource:0}: Error finding container 444d2c6420980125ca423e248deec819ec8ad3464f83d6db45dd7a994896871a: Status 404 returned error can't find the container with id 444d2c6420980125ca423e248deec819ec8ad3464f83d6db45dd7a994896871a
Apr 17 16:34:11.839886 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:11.839847 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45" event={"ID":"029fd86d-6da2-4fcd-9f62-3a1d068b6866","Type":"ContainerStarted","Data":"444d2c6420980125ca423e248deec819ec8ad3464f83d6db45dd7a994896871a"}
Apr 17 16:34:13.846863 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:13.846830 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45" event={"ID":"029fd86d-6da2-4fcd-9f62-3a1d068b6866","Type":"ContainerStarted","Data":"89efee4592cb9c6b688396bf532750b2c7c9f185861ae16afaba84ea8498072a"}
Apr 17 16:34:14.851877 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:14.851838 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45" event={"ID":"029fd86d-6da2-4fcd-9f62-3a1d068b6866","Type":"ContainerStarted","Data":"ef807f21c0c86af229cbfa2646b0dd0385c822296d600ec5f64b8382ead0d338"}
Apr 17 16:34:14.851877 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:14.851875 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45" event={"ID":"029fd86d-6da2-4fcd-9f62-3a1d068b6866","Type":"ContainerStarted","Data":"3cdb5230bf4a207704346a3819daa6ce20e2902cc2921e3c676ce4f8c1077965"}
Apr 17 16:34:14.874765 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:14.874713 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5f9d4b86d6-kxw45" podStartSLOduration=2.127716147 podStartE2EDuration="4.874700063s" podCreationTimestamp="2026-04-17 16:34:10 +0000 UTC" firstStartedPulling="2026-04-17 16:34:11.192070681 +0000 UTC m=+154.430586441" lastFinishedPulling="2026-04-17 16:34:13.939054584 +0000 UTC m=+157.177570357" observedRunningTime="2026-04-17 16:34:14.873349648 +0000 UTC m=+158.111865426" watchObservedRunningTime="2026-04-17 16:34:14.874700063 +0000 UTC m=+158.113215843"
Apr 17 16:34:15.493134 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.493102 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-55ffffd85c-vk2fv"]
Apr 17 16:34:15.496219 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.496201 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.503725 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.503703 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 17 16:34:15.503861 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.503792 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 16:34:15.504706 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.504689 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 16:34:15.506265 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.506251 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-4nh7g\""
Apr 17 16:34:15.506555 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.506541 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 16:34:15.506872 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.506859 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 16:34:15.511565 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.511545 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 17 16:34:15.518495 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.518476 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55ffffd85c-vk2fv"]
Apr 17 16:34:15.606741 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.606715 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/04a98a86-04d9-4790-a70b-8e68ef88690c-console-oauth-config\") pod \"console-55ffffd85c-vk2fv\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") " pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.606741 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.606744 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j2kt\" (UniqueName: \"kubernetes.io/projected/04a98a86-04d9-4790-a70b-8e68ef88690c-kube-api-access-5j2kt\") pod \"console-55ffffd85c-vk2fv\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") " pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.606941 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.606766 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-trusted-ca-bundle\") pod \"console-55ffffd85c-vk2fv\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") " pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.606941 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.606785 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-console-config\") pod \"console-55ffffd85c-vk2fv\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") " pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.606941 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.606825 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-oauth-serving-cert\") pod \"console-55ffffd85c-vk2fv\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") " pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.606941 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.606848 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-service-ca\") pod \"console-55ffffd85c-vk2fv\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") " pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.606941 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.606874 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/04a98a86-04d9-4790-a70b-8e68ef88690c-console-serving-cert\") pod \"console-55ffffd85c-vk2fv\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") " pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.708003 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.707976 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-console-config\") pod \"console-55ffffd85c-vk2fv\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") " pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.708150 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.708008 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-oauth-serving-cert\") pod \"console-55ffffd85c-vk2fv\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") " pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.708150 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.708034 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-service-ca\") pod \"console-55ffffd85c-vk2fv\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") " pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.708150 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.708061 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/04a98a86-04d9-4790-a70b-8e68ef88690c-console-serving-cert\") pod \"console-55ffffd85c-vk2fv\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") " pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.708150 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.708093 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/04a98a86-04d9-4790-a70b-8e68ef88690c-console-oauth-config\") pod \"console-55ffffd85c-vk2fv\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") " pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.708150 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.708109 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5j2kt\" (UniqueName: \"kubernetes.io/projected/04a98a86-04d9-4790-a70b-8e68ef88690c-kube-api-access-5j2kt\") pod \"console-55ffffd85c-vk2fv\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") " pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.708150 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.708132 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-trusted-ca-bundle\") pod \"console-55ffffd85c-vk2fv\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") " pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.708733 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.708704 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-oauth-serving-cert\") pod \"console-55ffffd85c-vk2fv\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") " pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.709024 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.709007 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-trusted-ca-bundle\") pod \"console-55ffffd85c-vk2fv\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") " pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.709105 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.709085 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-console-config\") pod \"console-55ffffd85c-vk2fv\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") " pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.709143 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.709101 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-service-ca\") pod \"console-55ffffd85c-vk2fv\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") " pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.710655 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.710635 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/04a98a86-04d9-4790-a70b-8e68ef88690c-console-serving-cert\") pod \"console-55ffffd85c-vk2fv\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") " pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.710791 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.710770 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/04a98a86-04d9-4790-a70b-8e68ef88690c-console-oauth-config\") pod \"console-55ffffd85c-vk2fv\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") " pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.715305 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.715282 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j2kt\" (UniqueName: \"kubernetes.io/projected/04a98a86-04d9-4790-a70b-8e68ef88690c-kube-api-access-5j2kt\") pod \"console-55ffffd85c-vk2fv\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") " pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.804944 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.804887 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:15.923513 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:15.923483 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55ffffd85c-vk2fv"]
Apr 17 16:34:15.926268 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:34:15.926246 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04a98a86_04d9_4790_a70b_8e68ef88690c.slice/crio-a7769cfd8963111c207817f470c3cc58066401f2f18c75335c3764d6b7b41f54 WatchSource:0}: Error finding container a7769cfd8963111c207817f470c3cc58066401f2f18c75335c3764d6b7b41f54: Status 404 returned error can't find the container with id a7769cfd8963111c207817f470c3cc58066401f2f18c75335c3764d6b7b41f54
Apr 17 16:34:16.859297 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:16.859258 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55ffffd85c-vk2fv" event={"ID":"04a98a86-04d9-4790-a70b-8e68ef88690c","Type":"ContainerStarted","Data":"b63780306197a570c6da07fd3a41be77caef50e5639d3c05d436c5786b7bd690"}
Apr 17 16:34:16.859297 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:16.859300 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55ffffd85c-vk2fv" event={"ID":"04a98a86-04d9-4790-a70b-8e68ef88690c","Type":"ContainerStarted","Data":"a7769cfd8963111c207817f470c3cc58066401f2f18c75335c3764d6b7b41f54"}
Apr 17 16:34:16.877934 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:16.877863 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55ffffd85c-vk2fv" podStartSLOduration=1.877846497 podStartE2EDuration="1.877846497s" podCreationTimestamp="2026-04-17 16:34:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:34:16.876581185 +0000 UTC m=+160.115096965" watchObservedRunningTime="2026-04-17 16:34:16.877846497 +0000 UTC m=+160.116362276"
Apr 17 16:34:25.805266 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:25.805214 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:25.805266 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:25.805280 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:25.809957 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:25.809934 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:34:25.885178 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:34:25.885092 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:35:42.195003 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.194969 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c5488589d-m2r5x"]
Apr 17 16:35:42.198101 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.198086 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.215320 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.215288 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c5488589d-m2r5x"]
Apr 17 16:35:42.303017 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.302985 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p97pz\" (UniqueName: \"kubernetes.io/projected/55cea390-288f-4402-91ae-e319c9c34078-kube-api-access-p97pz\") pod \"console-c5488589d-m2r5x\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.303195 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.303025 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-console-config\") pod \"console-c5488589d-m2r5x\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.303195 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.303090 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55cea390-288f-4402-91ae-e319c9c34078-console-oauth-config\") pod \"console-c5488589d-m2r5x\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.303195 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.303137 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55cea390-288f-4402-91ae-e319c9c34078-console-serving-cert\") pod \"console-c5488589d-m2r5x\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.303195 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.303165 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-oauth-serving-cert\") pod \"console-c5488589d-m2r5x\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.303391 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.303228 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-service-ca\") pod \"console-c5488589d-m2r5x\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.303391 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.303251 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-trusted-ca-bundle\") pod \"console-c5488589d-m2r5x\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.404601 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.404567 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-service-ca\") pod \"console-c5488589d-m2r5x\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.404601 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.404601 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-trusted-ca-bundle\") pod \"console-c5488589d-m2r5x\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.404841 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.404624 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p97pz\" (UniqueName: \"kubernetes.io/projected/55cea390-288f-4402-91ae-e319c9c34078-kube-api-access-p97pz\") pod \"console-c5488589d-m2r5x\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.404841 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.404642 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-console-config\") pod \"console-c5488589d-m2r5x\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.404841 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.404677 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55cea390-288f-4402-91ae-e319c9c34078-console-oauth-config\") pod \"console-c5488589d-m2r5x\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.404841 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.404708 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55cea390-288f-4402-91ae-e319c9c34078-console-serving-cert\") pod \"console-c5488589d-m2r5x\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.404841 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.404739 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-oauth-serving-cert\") pod \"console-c5488589d-m2r5x\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.405389 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.405365 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-service-ca\") pod \"console-c5488589d-m2r5x\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.405522 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.405496 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-console-config\") pod \"console-c5488589d-m2r5x\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.405760 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.405534 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-trusted-ca-bundle\") pod \"console-c5488589d-m2r5x\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.405760 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.405715 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-oauth-serving-cert\") pod \"console-c5488589d-m2r5x\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.407298 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.407267 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55cea390-288f-4402-91ae-e319c9c34078-console-oauth-config\") pod \"console-c5488589d-m2r5x\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.407402 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.407359 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55cea390-288f-4402-91ae-e319c9c34078-console-serving-cert\") pod \"console-c5488589d-m2r5x\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.413980 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.413887 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p97pz\" (UniqueName: \"kubernetes.io/projected/55cea390-288f-4402-91ae-e319c9c34078-kube-api-access-p97pz\") pod \"console-c5488589d-m2r5x\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.507511 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.507428 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:42.627279 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:42.627244 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c5488589d-m2r5x"]
Apr 17 16:35:42.630479 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:35:42.630451 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55cea390_288f_4402_91ae_e319c9c34078.slice/crio-c9a7e5e2e5192a52067be04ce0b791e153f24317347ceb809010a157645dd5be WatchSource:0}: Error finding container c9a7e5e2e5192a52067be04ce0b791e153f24317347ceb809010a157645dd5be: Status 404 returned error can't find the container with id c9a7e5e2e5192a52067be04ce0b791e153f24317347ceb809010a157645dd5be
Apr 17 16:35:43.078994 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:43.078955 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5488589d-m2r5x" event={"ID":"55cea390-288f-4402-91ae-e319c9c34078","Type":"ContainerStarted","Data":"c6f70496f6421c0157ece058a91219d016786259c87853bcad8ee604dc578e5c"}
Apr 17 16:35:43.078994 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:43.078993 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5488589d-m2r5x" event={"ID":"55cea390-288f-4402-91ae-e319c9c34078","Type":"ContainerStarted","Data":"c9a7e5e2e5192a52067be04ce0b791e153f24317347ceb809010a157645dd5be"}
Apr 17 16:35:43.100044 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:43.099982 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c5488589d-m2r5x" podStartSLOduration=1.09996387 podStartE2EDuration="1.09996387s" podCreationTimestamp="2026-04-17 16:35:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:35:43.098208865 +0000 UTC m=+246.336724655" watchObservedRunningTime="2026-04-17 16:35:43.09996387 +0000 UTC m=+246.338479652"
Apr 17 16:35:52.507833 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:52.507799 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:52.507833 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:52.507834 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:52.512698 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:52.512676 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:53.108795 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:53.108766 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c5488589d-m2r5x"
Apr 17 16:35:53.162650 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:35:53.162619 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55ffffd85c-vk2fv"]
Apr 17 16:36:18.184707 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.184654 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-55ffffd85c-vk2fv" podUID="04a98a86-04d9-4790-a70b-8e68ef88690c" containerName="console" containerID="cri-o://b63780306197a570c6da07fd3a41be77caef50e5639d3c05d436c5786b7bd690" gracePeriod=15
Apr 17 16:36:18.416153 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.416122 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55ffffd85c-vk2fv_04a98a86-04d9-4790-a70b-8e68ef88690c/console/0.log"
Apr 17 16:36:18.416269 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.416196 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:36:18.578129 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.578098 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-service-ca\") pod \"04a98a86-04d9-4790-a70b-8e68ef88690c\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") "
Apr 17 16:36:18.578324 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.578162 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/04a98a86-04d9-4790-a70b-8e68ef88690c-console-serving-cert\") pod \"04a98a86-04d9-4790-a70b-8e68ef88690c\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") "
Apr 17 16:36:18.578324 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.578198 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-console-config\") pod \"04a98a86-04d9-4790-a70b-8e68ef88690c\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") "
Apr 17 16:36:18.578324 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.578307 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-oauth-serving-cert\") pod \"04a98a86-04d9-4790-a70b-8e68ef88690c\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") "
Apr 17 16:36:18.578490 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.578339 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/04a98a86-04d9-4790-a70b-8e68ef88690c-console-oauth-config\") pod \"04a98a86-04d9-4790-a70b-8e68ef88690c\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") "
Apr 17 16:36:18.578490 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.578382 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-trusted-ca-bundle\") pod \"04a98a86-04d9-4790-a70b-8e68ef88690c\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") "
Apr 17 16:36:18.578591 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.578500 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j2kt\" (UniqueName: \"kubernetes.io/projected/04a98a86-04d9-4790-a70b-8e68ef88690c-kube-api-access-5j2kt\") pod \"04a98a86-04d9-4790-a70b-8e68ef88690c\" (UID: \"04a98a86-04d9-4790-a70b-8e68ef88690c\") "
Apr 17 16:36:18.578679 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.578641 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-service-ca" (OuterVolumeSpecName: "service-ca") pod "04a98a86-04d9-4790-a70b-8e68ef88690c" (UID: "04a98a86-04d9-4790-a70b-8e68ef88690c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:36:18.578755 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.578703 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "04a98a86-04d9-4790-a70b-8e68ef88690c" (UID: "04a98a86-04d9-4790-a70b-8e68ef88690c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:36:18.578755 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.578647 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-console-config" (OuterVolumeSpecName: "console-config") pod "04a98a86-04d9-4790-a70b-8e68ef88690c" (UID: "04a98a86-04d9-4790-a70b-8e68ef88690c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:36:18.578755 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.578728 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "04a98a86-04d9-4790-a70b-8e68ef88690c" (UID: "04a98a86-04d9-4790-a70b-8e68ef88690c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:36:18.578916 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.578776 2548 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-service-ca\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:36:18.578916 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.578796 2548 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-console-config\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:36:18.578916 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.578810 2548 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-oauth-serving-cert\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:36:18.580520 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.580491 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04a98a86-04d9-4790-a70b-8e68ef88690c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "04a98a86-04d9-4790-a70b-8e68ef88690c" (UID: "04a98a86-04d9-4790-a70b-8e68ef88690c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:36:18.580975 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.580518 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04a98a86-04d9-4790-a70b-8e68ef88690c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "04a98a86-04d9-4790-a70b-8e68ef88690c" (UID: "04a98a86-04d9-4790-a70b-8e68ef88690c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:36:18.580975 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.580792 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04a98a86-04d9-4790-a70b-8e68ef88690c-kube-api-access-5j2kt" (OuterVolumeSpecName: "kube-api-access-5j2kt") pod "04a98a86-04d9-4790-a70b-8e68ef88690c" (UID: "04a98a86-04d9-4790-a70b-8e68ef88690c"). InnerVolumeSpecName "kube-api-access-5j2kt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:36:18.679666 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.679633 2548 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/04a98a86-04d9-4790-a70b-8e68ef88690c-console-serving-cert\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:36:18.679666 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.679662 2548 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/04a98a86-04d9-4790-a70b-8e68ef88690c-console-oauth-config\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:36:18.679666 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.679672 2548 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04a98a86-04d9-4790-a70b-8e68ef88690c-trusted-ca-bundle\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:36:18.679879 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:18.679680 2548 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5j2kt\" (UniqueName: \"kubernetes.io/projected/04a98a86-04d9-4790-a70b-8e68ef88690c-kube-api-access-5j2kt\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\""
Apr 17 16:36:19.175672 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:19.175645 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55ffffd85c-vk2fv_04a98a86-04d9-4790-a70b-8e68ef88690c/console/0.log"
Apr 17 16:36:19.175977 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:19.175684 2548 generic.go:358] "Generic (PLEG): container finished" podID="04a98a86-04d9-4790-a70b-8e68ef88690c" containerID="b63780306197a570c6da07fd3a41be77caef50e5639d3c05d436c5786b7bd690" exitCode=2
Apr 17 16:36:19.175977 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:19.175718 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55ffffd85c-vk2fv" event={"ID":"04a98a86-04d9-4790-a70b-8e68ef88690c","Type":"ContainerDied","Data":"b63780306197a570c6da07fd3a41be77caef50e5639d3c05d436c5786b7bd690"}
Apr 17 16:36:19.175977 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:19.175763 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55ffffd85c-vk2fv" event={"ID":"04a98a86-04d9-4790-a70b-8e68ef88690c","Type":"ContainerDied","Data":"a7769cfd8963111c207817f470c3cc58066401f2f18c75335c3764d6b7b41f54"}
Apr 17 16:36:19.175977 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:19.175770 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55ffffd85c-vk2fv"
Apr 17 16:36:19.175977 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:19.175783 2548 scope.go:117] "RemoveContainer" containerID="b63780306197a570c6da07fd3a41be77caef50e5639d3c05d436c5786b7bd690"
Apr 17 16:36:19.185705 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:19.185573 2548 scope.go:117] "RemoveContainer" containerID="b63780306197a570c6da07fd3a41be77caef50e5639d3c05d436c5786b7bd690"
Apr 17 16:36:19.185965 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:36:19.185849 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b63780306197a570c6da07fd3a41be77caef50e5639d3c05d436c5786b7bd690\": container with ID starting with b63780306197a570c6da07fd3a41be77caef50e5639d3c05d436c5786b7bd690 not found: ID does not exist" containerID="b63780306197a570c6da07fd3a41be77caef50e5639d3c05d436c5786b7bd690"
Apr 17 16:36:19.185965 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:19.185873 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b63780306197a570c6da07fd3a41be77caef50e5639d3c05d436c5786b7bd690"} err="failed to get container status \"b63780306197a570c6da07fd3a41be77caef50e5639d3c05d436c5786b7bd690\": rpc error: code = NotFound desc = could not find container \"b63780306197a570c6da07fd3a41be77caef50e5639d3c05d436c5786b7bd690\": container with ID starting with b63780306197a570c6da07fd3a41be77caef50e5639d3c05d436c5786b7bd690 not found: ID does not exist"
Apr 17 16:36:19.197804 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:19.197783 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55ffffd85c-vk2fv"]
Apr 17 16:36:19.202858 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:19.202837 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-55ffffd85c-vk2fv"]
Apr 17 16:36:19.325570 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:19.325528 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04a98a86-04d9-4790-a70b-8e68ef88690c" path="/var/lib/kubelet/pods/04a98a86-04d9-4790-a70b-8e68ef88690c/volumes"
Apr 17 16:36:35.470647 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:35.470617 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh"]
Apr 17 16:36:35.471107 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:35.470854 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04a98a86-04d9-4790-a70b-8e68ef88690c" containerName="console"
Apr 17 16:36:35.471107 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:35.470864 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a98a86-04d9-4790-a70b-8e68ef88690c" containerName="console"
Apr 17 16:36:35.471107 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:35.470924 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="04a98a86-04d9-4790-a70b-8e68ef88690c" containerName="console"
Apr 17 16:36:35.473756 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:35.473739 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh"
Apr 17 16:36:35.476158 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:35.476130 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 16:36:35.476158 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:35.476156 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rnshp\""
Apr 17 16:36:35.476327 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:35.476124 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 16:36:35.481951 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:35.481929 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh"]
Apr 17 16:36:35.503334 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:35.503306 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/deca68ff-1d11-4025-baad-e37e37e00c26-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh\" (UID: \"deca68ff-1d11-4025-baad-e37e37e00c26\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh"
Apr 17 16:36:35.503442 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:35.503337 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/deca68ff-1d11-4025-baad-e37e37e00c26-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh\" (UID: \"deca68ff-1d11-4025-baad-e37e37e00c26\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh"
Apr 17 16:36:35.503442 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:35.503413 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-584cx\" (UniqueName: \"kubernetes.io/projected/deca68ff-1d11-4025-baad-e37e37e00c26-kube-api-access-584cx\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh\" (UID: \"deca68ff-1d11-4025-baad-e37e37e00c26\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh"
Apr 17 16:36:35.604067 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:35.604041 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-584cx\" (UniqueName: \"kubernetes.io/projected/deca68ff-1d11-4025-baad-e37e37e00c26-kube-api-access-584cx\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh\" (UID: \"deca68ff-1d11-4025-baad-e37e37e00c26\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh"
Apr 17 16:36:35.604168 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:35.604092 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/deca68ff-1d11-4025-baad-e37e37e00c26-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh\" (UID: \"deca68ff-1d11-4025-baad-e37e37e00c26\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh"
Apr 17 16:36:35.604168 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:35.604123 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/deca68ff-1d11-4025-baad-e37e37e00c26-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh\" (UID: \"deca68ff-1d11-4025-baad-e37e37e00c26\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh"
Apr 17 16:36:35.604496 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:35.604476 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/deca68ff-1d11-4025-baad-e37e37e00c26-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh\" (UID: \"deca68ff-1d11-4025-baad-e37e37e00c26\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh"
Apr 17 16:36:35.604553 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:35.604502 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/deca68ff-1d11-4025-baad-e37e37e00c26-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh\" (UID: \"deca68ff-1d11-4025-baad-e37e37e00c26\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh"
Apr 17 16:36:35.612459 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:35.612432 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-584cx\" (UniqueName: \"kubernetes.io/projected/deca68ff-1d11-4025-baad-e37e37e00c26-kube-api-access-584cx\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh\" (UID: \"deca68ff-1d11-4025-baad-e37e37e00c26\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh"
Apr 17 16:36:35.783760 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:35.783734 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh"
Apr 17 16:36:35.901924 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:35.901872 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh"]
Apr 17 16:36:35.905683 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:36:35.905656 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeca68ff_1d11_4025_baad_e37e37e00c26.slice/crio-a33fe12c0a5d66b88489094b80c7619429edc02a075f6b06a18dc42abbcfeeeb WatchSource:0}: Error finding container a33fe12c0a5d66b88489094b80c7619429edc02a075f6b06a18dc42abbcfeeeb: Status 404 returned error can't find the container with id a33fe12c0a5d66b88489094b80c7619429edc02a075f6b06a18dc42abbcfeeeb
Apr 17 16:36:36.220943 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:36.220852 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh" event={"ID":"deca68ff-1d11-4025-baad-e37e37e00c26","Type":"ContainerStarted","Data":"a33fe12c0a5d66b88489094b80c7619429edc02a075f6b06a18dc42abbcfeeeb"}
Apr 17 16:36:37.198144 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:37.198116 2548 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 16:36:42.238119 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:42.238031 2548 generic.go:358] "Generic (PLEG): container finished" podID="deca68ff-1d11-4025-baad-e37e37e00c26" containerID="6be8a92e59e444184a7175fe43d9603a1f14cb6cb85baf3ae6ea90df1788de2f" exitCode=0
Apr 17 16:36:42.238481 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:42.238118 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh" event={"ID":"deca68ff-1d11-4025-baad-e37e37e00c26","Type":"ContainerDied","Data":"6be8a92e59e444184a7175fe43d9603a1f14cb6cb85baf3ae6ea90df1788de2f"}
Apr 17 16:36:42.243130 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:42.243111 2548 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 16:36:44.245072 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:44.244984 2548 generic.go:358] "Generic (PLEG): container finished" podID="deca68ff-1d11-4025-baad-e37e37e00c26" containerID="0bfadfe3a5db4424afc86e25948e66dfb2648135f07a274a9d74f2a1468cb2db" exitCode=0
Apr 17 16:36:44.245072 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:44.245030 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh" event={"ID":"deca68ff-1d11-4025-baad-e37e37e00c26","Type":"ContainerDied","Data":"0bfadfe3a5db4424afc86e25948e66dfb2648135f07a274a9d74f2a1468cb2db"}
Apr 17 16:36:51.264825 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:51.264794 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh" event={"ID":"deca68ff-1d11-4025-baad-e37e37e00c26","Type":"ContainerStarted","Data":"ec6c19918e0286c45fe5fa04a1d078918670b95b3f372969a83006b67364888f"}
Apr 17 16:36:51.282387 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:51.282336 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh" podStartSLOduration=1.020289748 podStartE2EDuration="16.282320097s" podCreationTimestamp="2026-04-17 16:36:35 +0000 UTC" firstStartedPulling="2026-04-17 16:36:35.907519118 +0000 UTC m=+299.146034882" lastFinishedPulling="2026-04-17 16:36:51.169549473 +0000 UTC m=+314.408065231" observedRunningTime="2026-04-17 16:36:51.282266946 +0000 UTC m=+314.520782726"
watchObservedRunningTime="2026-04-17 16:36:51.282320097 +0000 UTC m=+314.520835876" Apr 17 16:36:52.269228 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:52.269194 2548 generic.go:358] "Generic (PLEG): container finished" podID="deca68ff-1d11-4025-baad-e37e37e00c26" containerID="ec6c19918e0286c45fe5fa04a1d078918670b95b3f372969a83006b67364888f" exitCode=0 Apr 17 16:36:52.269609 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:52.269290 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh" event={"ID":"deca68ff-1d11-4025-baad-e37e37e00c26","Type":"ContainerDied","Data":"ec6c19918e0286c45fe5fa04a1d078918670b95b3f372969a83006b67364888f"} Apr 17 16:36:53.389049 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:53.389028 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh" Apr 17 16:36:53.428936 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:53.428886 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/deca68ff-1d11-4025-baad-e37e37e00c26-bundle\") pod \"deca68ff-1d11-4025-baad-e37e37e00c26\" (UID: \"deca68ff-1d11-4025-baad-e37e37e00c26\") " Apr 17 16:36:53.429079 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:53.428970 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-584cx\" (UniqueName: \"kubernetes.io/projected/deca68ff-1d11-4025-baad-e37e37e00c26-kube-api-access-584cx\") pod \"deca68ff-1d11-4025-baad-e37e37e00c26\" (UID: \"deca68ff-1d11-4025-baad-e37e37e00c26\") " Apr 17 16:36:53.429079 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:53.428989 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/deca68ff-1d11-4025-baad-e37e37e00c26-util\") pod 
\"deca68ff-1d11-4025-baad-e37e37e00c26\" (UID: \"deca68ff-1d11-4025-baad-e37e37e00c26\") " Apr 17 16:36:53.429518 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:53.429492 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deca68ff-1d11-4025-baad-e37e37e00c26-bundle" (OuterVolumeSpecName: "bundle") pod "deca68ff-1d11-4025-baad-e37e37e00c26" (UID: "deca68ff-1d11-4025-baad-e37e37e00c26"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:36:53.431218 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:53.431189 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deca68ff-1d11-4025-baad-e37e37e00c26-kube-api-access-584cx" (OuterVolumeSpecName: "kube-api-access-584cx") pod "deca68ff-1d11-4025-baad-e37e37e00c26" (UID: "deca68ff-1d11-4025-baad-e37e37e00c26"). InnerVolumeSpecName "kube-api-access-584cx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:36:53.433066 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:53.433041 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deca68ff-1d11-4025-baad-e37e37e00c26-util" (OuterVolumeSpecName: "util") pod "deca68ff-1d11-4025-baad-e37e37e00c26" (UID: "deca68ff-1d11-4025-baad-e37e37e00c26"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:36:53.529848 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:53.529768 2548 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-584cx\" (UniqueName: \"kubernetes.io/projected/deca68ff-1d11-4025-baad-e37e37e00c26-kube-api-access-584cx\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:36:53.529848 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:53.529801 2548 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/deca68ff-1d11-4025-baad-e37e37e00c26-util\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:36:53.529848 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:53.529815 2548 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/deca68ff-1d11-4025-baad-e37e37e00c26-bundle\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:36:54.275981 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:54.275946 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh" event={"ID":"deca68ff-1d11-4025-baad-e37e37e00c26","Type":"ContainerDied","Data":"a33fe12c0a5d66b88489094b80c7619429edc02a075f6b06a18dc42abbcfeeeb"} Apr 17 16:36:54.275981 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:54.275980 2548 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a33fe12c0a5d66b88489094b80c7619429edc02a075f6b06a18dc42abbcfeeeb" Apr 17 16:36:54.276147 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:54.276016 2548 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cntjvh" Apr 17 16:36:57.291116 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:57.291084 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7"] Apr 17 16:36:57.291544 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:57.291348 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="deca68ff-1d11-4025-baad-e37e37e00c26" containerName="util" Apr 17 16:36:57.291544 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:57.291359 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="deca68ff-1d11-4025-baad-e37e37e00c26" containerName="util" Apr 17 16:36:57.291544 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:57.291368 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="deca68ff-1d11-4025-baad-e37e37e00c26" containerName="extract" Apr 17 16:36:57.291544 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:57.291373 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="deca68ff-1d11-4025-baad-e37e37e00c26" containerName="extract" Apr 17 16:36:57.291544 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:57.291390 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="deca68ff-1d11-4025-baad-e37e37e00c26" containerName="pull" Apr 17 16:36:57.291544 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:57.291396 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="deca68ff-1d11-4025-baad-e37e37e00c26" containerName="pull" Apr 17 16:36:57.291544 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:57.291435 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="deca68ff-1d11-4025-baad-e37e37e00c26" containerName="extract" Apr 17 16:36:57.313968 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:57.313939 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7"] Apr 17 16:36:57.314127 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:57.314067 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7" Apr 17 16:36:57.316587 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:57.316498 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 17 16:36:57.316587 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:57.316560 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 17 16:36:57.316786 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:57.316599 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 17 16:36:57.316786 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:57.316565 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-lrjsm\"" Apr 17 16:36:57.355962 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:57.355935 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/5ba1d964-bd48-401e-a523-debb4d1eb17a-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7\" (UID: \"5ba1d964-bd48-401e-a523-debb4d1eb17a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7" Apr 17 16:36:57.356056 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:57.356035 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mdpn\" (UniqueName: \"kubernetes.io/projected/5ba1d964-bd48-401e-a523-debb4d1eb17a-kube-api-access-6mdpn\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7\" (UID: 
\"5ba1d964-bd48-401e-a523-debb4d1eb17a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7" Apr 17 16:36:57.456741 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:57.456714 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mdpn\" (UniqueName: \"kubernetes.io/projected/5ba1d964-bd48-401e-a523-debb4d1eb17a-kube-api-access-6mdpn\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7\" (UID: \"5ba1d964-bd48-401e-a523-debb4d1eb17a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7" Apr 17 16:36:57.456864 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:57.456762 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/5ba1d964-bd48-401e-a523-debb4d1eb17a-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7\" (UID: \"5ba1d964-bd48-401e-a523-debb4d1eb17a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7" Apr 17 16:36:57.459124 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:57.459104 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/5ba1d964-bd48-401e-a523-debb4d1eb17a-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7\" (UID: \"5ba1d964-bd48-401e-a523-debb4d1eb17a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7" Apr 17 16:36:57.468478 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:57.468459 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mdpn\" (UniqueName: \"kubernetes.io/projected/5ba1d964-bd48-401e-a523-debb4d1eb17a-kube-api-access-6mdpn\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7\" (UID: \"5ba1d964-bd48-401e-a523-debb4d1eb17a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7" Apr 17 16:36:57.626469 ip-10-0-141-239 
kubenswrapper[2548]: I0417 16:36:57.626404 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7" Apr 17 16:36:57.755128 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:57.755069 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7"] Apr 17 16:36:57.760648 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:36:57.760617 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ba1d964_bd48_401e_a523_debb4d1eb17a.slice/crio-7630365bc09ad5b7619306ebef1a546cdfa8f02c3e90282d7a70d7bc5aa1e784 WatchSource:0}: Error finding container 7630365bc09ad5b7619306ebef1a546cdfa8f02c3e90282d7a70d7bc5aa1e784: Status 404 returned error can't find the container with id 7630365bc09ad5b7619306ebef1a546cdfa8f02c3e90282d7a70d7bc5aa1e784 Apr 17 16:36:58.289187 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:36:58.289144 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7" event={"ID":"5ba1d964-bd48-401e-a523-debb4d1eb17a","Type":"ContainerStarted","Data":"7630365bc09ad5b7619306ebef1a546cdfa8f02c3e90282d7a70d7bc5aa1e784"} Apr 17 16:37:01.300003 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.299972 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7" event={"ID":"5ba1d964-bd48-401e-a523-debb4d1eb17a","Type":"ContainerStarted","Data":"3d93c648c7dfc442305901a912d2ff4045211327aff84a5adbd43afa01e4adb5"} Apr 17 16:37:01.300328 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.300081 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7" Apr 17 16:37:01.323873 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.323822 2548 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7" podStartSLOduration=1.156775791 podStartE2EDuration="4.323807475s" podCreationTimestamp="2026-04-17 16:36:57 +0000 UTC" firstStartedPulling="2026-04-17 16:36:57.763124815 +0000 UTC m=+321.001640583" lastFinishedPulling="2026-04-17 16:37:00.930156507 +0000 UTC m=+324.168672267" observedRunningTime="2026-04-17 16:37:01.321428627 +0000 UTC m=+324.559944406" watchObservedRunningTime="2026-04-17 16:37:01.323807475 +0000 UTC m=+324.562323255" Apr 17 16:37:01.483171 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.483144 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-4rlk5"] Apr 17 16:37:01.485833 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.485809 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-4rlk5" Apr 17 16:37:01.488482 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.488427 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-z2dk4\"" Apr 17 16:37:01.488603 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.488587 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 17 16:37:01.488865 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.488850 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 17 16:37:01.498336 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.498304 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-4rlk5"] Apr 17 16:37:01.587254 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.587186 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/594b308b-a6cb-42ef-86c8-ab4e13b6b106-certificates\") pod \"keda-operator-ffbb595cb-4rlk5\" (UID: \"594b308b-a6cb-42ef-86c8-ab4e13b6b106\") " pod="openshift-keda/keda-operator-ffbb595cb-4rlk5" Apr 17 16:37:01.587254 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.587248 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/594b308b-a6cb-42ef-86c8-ab4e13b6b106-cabundle0\") pod \"keda-operator-ffbb595cb-4rlk5\" (UID: \"594b308b-a6cb-42ef-86c8-ab4e13b6b106\") " pod="openshift-keda/keda-operator-ffbb595cb-4rlk5" Apr 17 16:37:01.587419 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.587310 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s52qr\" (UniqueName: \"kubernetes.io/projected/594b308b-a6cb-42ef-86c8-ab4e13b6b106-kube-api-access-s52qr\") pod \"keda-operator-ffbb595cb-4rlk5\" (UID: \"594b308b-a6cb-42ef-86c8-ab4e13b6b106\") " pod="openshift-keda/keda-operator-ffbb595cb-4rlk5" Apr 17 16:37:01.688365 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.688317 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s52qr\" (UniqueName: \"kubernetes.io/projected/594b308b-a6cb-42ef-86c8-ab4e13b6b106-kube-api-access-s52qr\") pod \"keda-operator-ffbb595cb-4rlk5\" (UID: \"594b308b-a6cb-42ef-86c8-ab4e13b6b106\") " pod="openshift-keda/keda-operator-ffbb595cb-4rlk5" Apr 17 16:37:01.688553 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.688388 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/594b308b-a6cb-42ef-86c8-ab4e13b6b106-certificates\") pod \"keda-operator-ffbb595cb-4rlk5\" (UID: \"594b308b-a6cb-42ef-86c8-ab4e13b6b106\") " pod="openshift-keda/keda-operator-ffbb595cb-4rlk5" Apr 17 16:37:01.688553 ip-10-0-141-239 kubenswrapper[2548]: I0417 
16:37:01.688431 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/594b308b-a6cb-42ef-86c8-ab4e13b6b106-cabundle0\") pod \"keda-operator-ffbb595cb-4rlk5\" (UID: \"594b308b-a6cb-42ef-86c8-ab4e13b6b106\") " pod="openshift-keda/keda-operator-ffbb595cb-4rlk5" Apr 17 16:37:01.688553 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:01.688527 2548 secret.go:281] references non-existent secret key: ca.crt Apr 17 16:37:01.688553 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:01.688548 2548 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 16:37:01.688769 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:01.688558 2548 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-4rlk5: references non-existent secret key: ca.crt Apr 17 16:37:01.688769 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:01.688608 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/594b308b-a6cb-42ef-86c8-ab4e13b6b106-certificates podName:594b308b-a6cb-42ef-86c8-ab4e13b6b106 nodeName:}" failed. No retries permitted until 2026-04-17 16:37:02.188593609 +0000 UTC m=+325.427109366 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/594b308b-a6cb-42ef-86c8-ab4e13b6b106-certificates") pod "keda-operator-ffbb595cb-4rlk5" (UID: "594b308b-a6cb-42ef-86c8-ab4e13b6b106") : references non-existent secret key: ca.crt Apr 17 16:37:01.689168 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.689144 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/594b308b-a6cb-42ef-86c8-ab4e13b6b106-cabundle0\") pod \"keda-operator-ffbb595cb-4rlk5\" (UID: \"594b308b-a6cb-42ef-86c8-ab4e13b6b106\") " pod="openshift-keda/keda-operator-ffbb595cb-4rlk5" Apr 17 16:37:01.696757 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.696736 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s52qr\" (UniqueName: \"kubernetes.io/projected/594b308b-a6cb-42ef-86c8-ab4e13b6b106-kube-api-access-s52qr\") pod \"keda-operator-ffbb595cb-4rlk5\" (UID: \"594b308b-a6cb-42ef-86c8-ab4e13b6b106\") " pod="openshift-keda/keda-operator-ffbb595cb-4rlk5" Apr 17 16:37:01.788564 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.788540 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz"] Apr 17 16:37:01.791824 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.791810 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz" Apr 17 16:37:01.800180 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.800161 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 17 16:37:01.806327 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.806307 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz"] Apr 17 16:37:01.889921 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.889841 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0f318dd9-5e32-430f-af77-cc49df24ed9e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-tc7mz\" (UID: \"0f318dd9-5e32-430f-af77-cc49df24ed9e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz" Apr 17 16:37:01.889921 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.889877 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncgth\" (UniqueName: \"kubernetes.io/projected/0f318dd9-5e32-430f-af77-cc49df24ed9e-kube-api-access-ncgth\") pod \"keda-metrics-apiserver-7c9f485588-tc7mz\" (UID: \"0f318dd9-5e32-430f-af77-cc49df24ed9e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz" Apr 17 16:37:01.890066 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.889963 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/0f318dd9-5e32-430f-af77-cc49df24ed9e-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-tc7mz\" (UID: \"0f318dd9-5e32-430f-af77-cc49df24ed9e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz" Apr 17 16:37:01.990918 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.990851 2548 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0f318dd9-5e32-430f-af77-cc49df24ed9e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-tc7mz\" (UID: \"0f318dd9-5e32-430f-af77-cc49df24ed9e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz" Apr 17 16:37:01.990918 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.990929 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ncgth\" (UniqueName: \"kubernetes.io/projected/0f318dd9-5e32-430f-af77-cc49df24ed9e-kube-api-access-ncgth\") pod \"keda-metrics-apiserver-7c9f485588-tc7mz\" (UID: \"0f318dd9-5e32-430f-af77-cc49df24ed9e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz" Apr 17 16:37:01.991136 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.990974 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/0f318dd9-5e32-430f-af77-cc49df24ed9e-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-tc7mz\" (UID: \"0f318dd9-5e32-430f-af77-cc49df24ed9e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz" Apr 17 16:37:01.991136 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:01.990993 2548 secret.go:281] references non-existent secret key: tls.crt Apr 17 16:37:01.991136 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:01.991013 2548 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 16:37:01.991136 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:01.991028 2548 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 17 16:37:01.991136 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:01.991044 2548 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz: [references 
non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 17 16:37:01.991136 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:01.991107 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0f318dd9-5e32-430f-af77-cc49df24ed9e-certificates podName:0f318dd9-5e32-430f-af77-cc49df24ed9e nodeName:}" failed. No retries permitted until 2026-04-17 16:37:02.491092327 +0000 UTC m=+325.729608084 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0f318dd9-5e32-430f-af77-cc49df24ed9e-certificates") pod "keda-metrics-apiserver-7c9f485588-tc7mz" (UID: "0f318dd9-5e32-430f-af77-cc49df24ed9e") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 17 16:37:01.991334 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:01.991309 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/0f318dd9-5e32-430f-af77-cc49df24ed9e-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-tc7mz\" (UID: \"0f318dd9-5e32-430f-af77-cc49df24ed9e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz" Apr 17 16:37:02.002332 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:02.002307 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncgth\" (UniqueName: \"kubernetes.io/projected/0f318dd9-5e32-430f-af77-cc49df24ed9e-kube-api-access-ncgth\") pod \"keda-metrics-apiserver-7c9f485588-tc7mz\" (UID: \"0f318dd9-5e32-430f-af77-cc49df24ed9e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz" Apr 17 16:37:02.151821 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:02.151746 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-nqb8d"] Apr 17 16:37:02.154995 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:02.154979 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-nqb8d" Apr 17 16:37:02.157316 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:02.157295 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 17 16:37:02.166130 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:02.166107 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-nqb8d"] Apr 17 16:37:02.192662 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:02.192629 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzxmr\" (UniqueName: \"kubernetes.io/projected/475f7bb3-7549-45b8-9770-aefe7ed44b1b-kube-api-access-tzxmr\") pod \"keda-admission-cf49989db-nqb8d\" (UID: \"475f7bb3-7549-45b8-9770-aefe7ed44b1b\") " pod="openshift-keda/keda-admission-cf49989db-nqb8d" Apr 17 16:37:02.192770 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:02.192682 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/475f7bb3-7549-45b8-9770-aefe7ed44b1b-certificates\") pod \"keda-admission-cf49989db-nqb8d\" (UID: \"475f7bb3-7549-45b8-9770-aefe7ed44b1b\") " pod="openshift-keda/keda-admission-cf49989db-nqb8d" Apr 17 16:37:02.192770 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:02.192735 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/594b308b-a6cb-42ef-86c8-ab4e13b6b106-certificates\") pod \"keda-operator-ffbb595cb-4rlk5\" (UID: \"594b308b-a6cb-42ef-86c8-ab4e13b6b106\") " pod="openshift-keda/keda-operator-ffbb595cb-4rlk5" Apr 17 16:37:02.192857 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:02.192825 2548 secret.go:281] references non-existent secret key: ca.crt Apr 17 16:37:02.192857 ip-10-0-141-239 kubenswrapper[2548]: E0417 
16:37:02.192838 2548 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 16:37:02.192857 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:02.192846 2548 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-4rlk5: references non-existent secret key: ca.crt Apr 17 16:37:02.192984 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:02.192890 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/594b308b-a6cb-42ef-86c8-ab4e13b6b106-certificates podName:594b308b-a6cb-42ef-86c8-ab4e13b6b106 nodeName:}" failed. No retries permitted until 2026-04-17 16:37:03.192876521 +0000 UTC m=+326.431392279 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/594b308b-a6cb-42ef-86c8-ab4e13b6b106-certificates") pod "keda-operator-ffbb595cb-4rlk5" (UID: "594b308b-a6cb-42ef-86c8-ab4e13b6b106") : references non-existent secret key: ca.crt Apr 17 16:37:02.293519 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:02.293479 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzxmr\" (UniqueName: \"kubernetes.io/projected/475f7bb3-7549-45b8-9770-aefe7ed44b1b-kube-api-access-tzxmr\") pod \"keda-admission-cf49989db-nqb8d\" (UID: \"475f7bb3-7549-45b8-9770-aefe7ed44b1b\") " pod="openshift-keda/keda-admission-cf49989db-nqb8d" Apr 17 16:37:02.293702 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:02.293535 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/475f7bb3-7549-45b8-9770-aefe7ed44b1b-certificates\") pod \"keda-admission-cf49989db-nqb8d\" (UID: \"475f7bb3-7549-45b8-9770-aefe7ed44b1b\") " pod="openshift-keda/keda-admission-cf49989db-nqb8d" Apr 17 16:37:02.296240 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:02.296216 2548 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/475f7bb3-7549-45b8-9770-aefe7ed44b1b-certificates\") pod \"keda-admission-cf49989db-nqb8d\" (UID: \"475f7bb3-7549-45b8-9770-aefe7ed44b1b\") " pod="openshift-keda/keda-admission-cf49989db-nqb8d" Apr 17 16:37:02.302568 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:02.302530 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzxmr\" (UniqueName: \"kubernetes.io/projected/475f7bb3-7549-45b8-9770-aefe7ed44b1b-kube-api-access-tzxmr\") pod \"keda-admission-cf49989db-nqb8d\" (UID: \"475f7bb3-7549-45b8-9770-aefe7ed44b1b\") " pod="openshift-keda/keda-admission-cf49989db-nqb8d" Apr 17 16:37:02.465388 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:02.465314 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-nqb8d" Apr 17 16:37:02.495648 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:02.495620 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0f318dd9-5e32-430f-af77-cc49df24ed9e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-tc7mz\" (UID: \"0f318dd9-5e32-430f-af77-cc49df24ed9e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz" Apr 17 16:37:02.495789 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:02.495771 2548 secret.go:281] references non-existent secret key: tls.crt Apr 17 16:37:02.495856 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:02.495795 2548 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 16:37:02.495856 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:02.495820 2548 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz: references non-existent secret key: 
tls.crt Apr 17 16:37:02.495967 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:02.495912 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0f318dd9-5e32-430f-af77-cc49df24ed9e-certificates podName:0f318dd9-5e32-430f-af77-cc49df24ed9e nodeName:}" failed. No retries permitted until 2026-04-17 16:37:03.495873227 +0000 UTC m=+326.734389004 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0f318dd9-5e32-430f-af77-cc49df24ed9e-certificates") pod "keda-metrics-apiserver-7c9f485588-tc7mz" (UID: "0f318dd9-5e32-430f-af77-cc49df24ed9e") : references non-existent secret key: tls.crt Apr 17 16:37:02.591165 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:02.591135 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-nqb8d"] Apr 17 16:37:02.594317 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:37:02.594292 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod475f7bb3_7549_45b8_9770_aefe7ed44b1b.slice/crio-3bacbf38f233f7181a01c81109be94e17d6621492df252a0685a2df6c1189da1 WatchSource:0}: Error finding container 3bacbf38f233f7181a01c81109be94e17d6621492df252a0685a2df6c1189da1: Status 404 returned error can't find the container with id 3bacbf38f233f7181a01c81109be94e17d6621492df252a0685a2df6c1189da1 Apr 17 16:37:03.200789 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:03.200753 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/594b308b-a6cb-42ef-86c8-ab4e13b6b106-certificates\") pod \"keda-operator-ffbb595cb-4rlk5\" (UID: \"594b308b-a6cb-42ef-86c8-ab4e13b6b106\") " pod="openshift-keda/keda-operator-ffbb595cb-4rlk5" Apr 17 16:37:03.201079 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:03.201039 2548 secret.go:281] references non-existent secret key: ca.crt Apr 17 
16:37:03.201163 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:03.201120 2548 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 16:37:03.201163 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:03.201134 2548 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-4rlk5: references non-existent secret key: ca.crt Apr 17 16:37:03.201233 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:03.201203 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/594b308b-a6cb-42ef-86c8-ab4e13b6b106-certificates podName:594b308b-a6cb-42ef-86c8-ab4e13b6b106 nodeName:}" failed. No retries permitted until 2026-04-17 16:37:05.201182085 +0000 UTC m=+328.439697856 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/594b308b-a6cb-42ef-86c8-ab4e13b6b106-certificates") pod "keda-operator-ffbb595cb-4rlk5" (UID: "594b308b-a6cb-42ef-86c8-ab4e13b6b106") : references non-existent secret key: ca.crt Apr 17 16:37:03.308068 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:03.308023 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-nqb8d" event={"ID":"475f7bb3-7549-45b8-9770-aefe7ed44b1b","Type":"ContainerStarted","Data":"3bacbf38f233f7181a01c81109be94e17d6621492df252a0685a2df6c1189da1"} Apr 17 16:37:03.504326 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:03.504242 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0f318dd9-5e32-430f-af77-cc49df24ed9e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-tc7mz\" (UID: \"0f318dd9-5e32-430f-af77-cc49df24ed9e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz" Apr 17 16:37:03.504452 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:03.504403 2548 secret.go:281] 
references non-existent secret key: tls.crt Apr 17 16:37:03.504452 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:03.504426 2548 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 16:37:03.504452 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:03.504448 2548 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz: references non-existent secret key: tls.crt Apr 17 16:37:03.504552 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:37:03.504513 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0f318dd9-5e32-430f-af77-cc49df24ed9e-certificates podName:0f318dd9-5e32-430f-af77-cc49df24ed9e nodeName:}" failed. No retries permitted until 2026-04-17 16:37:05.50449424 +0000 UTC m=+328.743010003 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0f318dd9-5e32-430f-af77-cc49df24ed9e-certificates") pod "keda-metrics-apiserver-7c9f485588-tc7mz" (UID: "0f318dd9-5e32-430f-af77-cc49df24ed9e") : references non-existent secret key: tls.crt Apr 17 16:37:04.312460 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:04.312424 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-nqb8d" event={"ID":"475f7bb3-7549-45b8-9770-aefe7ed44b1b","Type":"ContainerStarted","Data":"de47ffa2f22dc76e1b4be619ccf159ed08fde0fd299042d5ad9a6c0a40854695"} Apr 17 16:37:04.312916 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:04.312580 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-nqb8d" Apr 17 16:37:04.329713 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:04.329673 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-nqb8d" podStartSLOduration=1.076603221 
podStartE2EDuration="2.329659923s" podCreationTimestamp="2026-04-17 16:37:02 +0000 UTC" firstStartedPulling="2026-04-17 16:37:02.595583908 +0000 UTC m=+325.834099669" lastFinishedPulling="2026-04-17 16:37:03.848640615 +0000 UTC m=+327.087156371" observedRunningTime="2026-04-17 16:37:04.328868505 +0000 UTC m=+327.567384284" watchObservedRunningTime="2026-04-17 16:37:04.329659923 +0000 UTC m=+327.568175702" Apr 17 16:37:05.218800 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:05.218767 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/594b308b-a6cb-42ef-86c8-ab4e13b6b106-certificates\") pod \"keda-operator-ffbb595cb-4rlk5\" (UID: \"594b308b-a6cb-42ef-86c8-ab4e13b6b106\") " pod="openshift-keda/keda-operator-ffbb595cb-4rlk5" Apr 17 16:37:05.221358 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:05.221326 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/594b308b-a6cb-42ef-86c8-ab4e13b6b106-certificates\") pod \"keda-operator-ffbb595cb-4rlk5\" (UID: \"594b308b-a6cb-42ef-86c8-ab4e13b6b106\") " pod="openshift-keda/keda-operator-ffbb595cb-4rlk5" Apr 17 16:37:05.395338 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:05.395306 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-4rlk5" Apr 17 16:37:05.513976 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:05.513949 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-4rlk5"] Apr 17 16:37:05.516114 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:37:05.516089 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod594b308b_a6cb_42ef_86c8_ab4e13b6b106.slice/crio-dfe0d77098508049b548001de91f40ed5e03588b70f9a6d9aba87503a06cc9bd WatchSource:0}: Error finding container dfe0d77098508049b548001de91f40ed5e03588b70f9a6d9aba87503a06cc9bd: Status 404 returned error can't find the container with id dfe0d77098508049b548001de91f40ed5e03588b70f9a6d9aba87503a06cc9bd Apr 17 16:37:05.521189 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:05.521165 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0f318dd9-5e32-430f-af77-cc49df24ed9e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-tc7mz\" (UID: \"0f318dd9-5e32-430f-af77-cc49df24ed9e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz" Apr 17 16:37:05.523712 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:05.523688 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0f318dd9-5e32-430f-af77-cc49df24ed9e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-tc7mz\" (UID: \"0f318dd9-5e32-430f-af77-cc49df24ed9e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz" Apr 17 16:37:05.701342 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:05.701316 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz" Apr 17 16:37:05.819485 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:05.819449 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz"] Apr 17 16:37:05.822540 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:37:05.822511 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f318dd9_5e32_430f_af77_cc49df24ed9e.slice/crio-1d730f02085c9fda27ae7c6d9f3a3abb4b62c4f1284cb858539e78439ae6107d WatchSource:0}: Error finding container 1d730f02085c9fda27ae7c6d9f3a3abb4b62c4f1284cb858539e78439ae6107d: Status 404 returned error can't find the container with id 1d730f02085c9fda27ae7c6d9f3a3abb4b62c4f1284cb858539e78439ae6107d Apr 17 16:37:06.319291 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:06.319255 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-4rlk5" event={"ID":"594b308b-a6cb-42ef-86c8-ab4e13b6b106","Type":"ContainerStarted","Data":"dfe0d77098508049b548001de91f40ed5e03588b70f9a6d9aba87503a06cc9bd"} Apr 17 16:37:06.320368 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:06.320342 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz" event={"ID":"0f318dd9-5e32-430f-af77-cc49df24ed9e","Type":"ContainerStarted","Data":"1d730f02085c9fda27ae7c6d9f3a3abb4b62c4f1284cb858539e78439ae6107d"} Apr 17 16:37:09.330040 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:09.330004 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz" event={"ID":"0f318dd9-5e32-430f-af77-cc49df24ed9e","Type":"ContainerStarted","Data":"48d47f9c1ecd316512dccbddc805442b4c14a9eef5e68aadc63a753e66b24b12"} Apr 17 16:37:09.330432 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:09.330220 2548 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz" Apr 17 16:37:09.347632 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:09.347518 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz" podStartSLOduration=5.716511856 podStartE2EDuration="8.34750397s" podCreationTimestamp="2026-04-17 16:37:01 +0000 UTC" firstStartedPulling="2026-04-17 16:37:05.824034082 +0000 UTC m=+329.062549849" lastFinishedPulling="2026-04-17 16:37:08.455026201 +0000 UTC m=+331.693541963" observedRunningTime="2026-04-17 16:37:09.346685975 +0000 UTC m=+332.585201754" watchObservedRunningTime="2026-04-17 16:37:09.34750397 +0000 UTC m=+332.586019750" Apr 17 16:37:15.352110 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:15.352066 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-4rlk5" event={"ID":"594b308b-a6cb-42ef-86c8-ab4e13b6b106","Type":"ContainerStarted","Data":"1c2bc0ab3f87b8100c9c4c50a6e31c6d829b066fa36a27cc981c42fc0303584b"} Apr 17 16:37:15.352478 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:15.352305 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-4rlk5" Apr 17 16:37:15.368454 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:15.368409 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-4rlk5" podStartSLOduration=4.861805882 podStartE2EDuration="14.36839783s" podCreationTimestamp="2026-04-17 16:37:01 +0000 UTC" firstStartedPulling="2026-04-17 16:37:05.517432933 +0000 UTC m=+328.755948692" lastFinishedPulling="2026-04-17 16:37:15.024024879 +0000 UTC m=+338.262540640" observedRunningTime="2026-04-17 16:37:15.367216559 +0000 UTC m=+338.605732338" watchObservedRunningTime="2026-04-17 16:37:15.36839783 +0000 UTC m=+338.606913609" Apr 17 16:37:20.339064 ip-10-0-141-239 
kubenswrapper[2548]: I0417 16:37:20.339037 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tc7mz" Apr 17 16:37:22.305496 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:22.305467 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-k5bd7" Apr 17 16:37:25.317675 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:25.317642 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-nqb8d" Apr 17 16:37:36.357109 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:37:36.357038 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-4rlk5" Apr 17 16:38:10.821157 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.821126 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-d8wsl"] Apr 17 16:38:10.824218 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.824200 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85bb65f8c4-d8wsl" Apr 17 16:38:10.827280 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.827262 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 17 16:38:10.827364 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.827331 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 16:38:10.827763 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.827744 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 16:38:10.828280 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.828267 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-g84d2\"" Apr 17 16:38:10.832316 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.832294 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-449rq"] Apr 17 16:38:10.835263 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.835246 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-449rq" Apr 17 16:38:10.836195 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.836174 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-d8wsl"] Apr 17 16:38:10.837361 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.837343 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-h6lqd\"" Apr 17 16:38:10.837454 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.837385 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 17 16:38:10.845152 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.845129 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-449rq"] Apr 17 16:38:10.861752 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.861733 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-skv55"] Apr 17 16:38:10.866807 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.866786 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-skv55" Apr 17 16:38:10.869424 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.869395 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-bm2wd\"" Apr 17 16:38:10.869825 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.869811 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 17 16:38:10.875579 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.875561 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-skv55"] Apr 17 16:38:10.897520 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.897502 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9e91b1f8-9725-4761-b726-d5e59ab3c67c-data\") pod \"seaweedfs-86cc847c5c-skv55\" (UID: \"9e91b1f8-9725-4761-b726-d5e59ab3c67c\") " pod="kserve/seaweedfs-86cc847c5c-skv55" Apr 17 16:38:10.897611 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.897535 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06646c4e-8609-49b7-a255-41cbfb0ead68-cert\") pod \"kserve-controller-manager-85bb65f8c4-d8wsl\" (UID: \"06646c4e-8609-49b7-a255-41cbfb0ead68\") " pod="kserve/kserve-controller-manager-85bb65f8c4-d8wsl" Apr 17 16:38:10.897611 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.897552 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgg7j\" (UniqueName: \"kubernetes.io/projected/50ba178c-7c75-47c1-bd38-740ebeecf1fc-kube-api-access-vgg7j\") pod \"llmisvc-controller-manager-68cc5db7c4-449rq\" (UID: \"50ba178c-7c75-47c1-bd38-740ebeecf1fc\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-449rq" Apr 17 16:38:10.897611 ip-10-0-141-239 kubenswrapper[2548]: 
I0417 16:38:10.897571 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd4kw\" (UniqueName: \"kubernetes.io/projected/9e91b1f8-9725-4761-b726-d5e59ab3c67c-kube-api-access-sd4kw\") pod \"seaweedfs-86cc847c5c-skv55\" (UID: \"9e91b1f8-9725-4761-b726-d5e59ab3c67c\") " pod="kserve/seaweedfs-86cc847c5c-skv55" Apr 17 16:38:10.897611 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.897594 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50ba178c-7c75-47c1-bd38-740ebeecf1fc-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-449rq\" (UID: \"50ba178c-7c75-47c1-bd38-740ebeecf1fc\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-449rq" Apr 17 16:38:10.897778 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.897642 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85nng\" (UniqueName: \"kubernetes.io/projected/06646c4e-8609-49b7-a255-41cbfb0ead68-kube-api-access-85nng\") pod \"kserve-controller-manager-85bb65f8c4-d8wsl\" (UID: \"06646c4e-8609-49b7-a255-41cbfb0ead68\") " pod="kserve/kserve-controller-manager-85bb65f8c4-d8wsl" Apr 17 16:38:10.997962 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.997933 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85nng\" (UniqueName: \"kubernetes.io/projected/06646c4e-8609-49b7-a255-41cbfb0ead68-kube-api-access-85nng\") pod \"kserve-controller-manager-85bb65f8c4-d8wsl\" (UID: \"06646c4e-8609-49b7-a255-41cbfb0ead68\") " pod="kserve/kserve-controller-manager-85bb65f8c4-d8wsl" Apr 17 16:38:10.998079 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.997986 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9e91b1f8-9725-4761-b726-d5e59ab3c67c-data\") pod 
\"seaweedfs-86cc847c5c-skv55\" (UID: \"9e91b1f8-9725-4761-b726-d5e59ab3c67c\") " pod="kserve/seaweedfs-86cc847c5c-skv55" Apr 17 16:38:10.998079 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.998030 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06646c4e-8609-49b7-a255-41cbfb0ead68-cert\") pod \"kserve-controller-manager-85bb65f8c4-d8wsl\" (UID: \"06646c4e-8609-49b7-a255-41cbfb0ead68\") " pod="kserve/kserve-controller-manager-85bb65f8c4-d8wsl" Apr 17 16:38:10.998079 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.998053 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgg7j\" (UniqueName: \"kubernetes.io/projected/50ba178c-7c75-47c1-bd38-740ebeecf1fc-kube-api-access-vgg7j\") pod \"llmisvc-controller-manager-68cc5db7c4-449rq\" (UID: \"50ba178c-7c75-47c1-bd38-740ebeecf1fc\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-449rq" Apr 17 16:38:10.998232 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.998080 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sd4kw\" (UniqueName: \"kubernetes.io/projected/9e91b1f8-9725-4761-b726-d5e59ab3c67c-kube-api-access-sd4kw\") pod \"seaweedfs-86cc847c5c-skv55\" (UID: \"9e91b1f8-9725-4761-b726-d5e59ab3c67c\") " pod="kserve/seaweedfs-86cc847c5c-skv55" Apr 17 16:38:10.998232 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.998102 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50ba178c-7c75-47c1-bd38-740ebeecf1fc-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-449rq\" (UID: \"50ba178c-7c75-47c1-bd38-740ebeecf1fc\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-449rq" Apr 17 16:38:10.998232 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:38:10.998175 2548 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" 
not found Apr 17 16:38:10.998232 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:38:10.998195 2548 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 17 16:38:10.998412 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:38:10.998258 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06646c4e-8609-49b7-a255-41cbfb0ead68-cert podName:06646c4e-8609-49b7-a255-41cbfb0ead68 nodeName:}" failed. No retries permitted until 2026-04-17 16:38:11.498237811 +0000 UTC m=+394.736753577 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06646c4e-8609-49b7-a255-41cbfb0ead68-cert") pod "kserve-controller-manager-85bb65f8c4-d8wsl" (UID: "06646c4e-8609-49b7-a255-41cbfb0ead68") : secret "kserve-webhook-server-cert" not found Apr 17 16:38:10.998412 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:38:10.998278 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50ba178c-7c75-47c1-bd38-740ebeecf1fc-cert podName:50ba178c-7c75-47c1-bd38-740ebeecf1fc nodeName:}" failed. No retries permitted until 2026-04-17 16:38:11.498269829 +0000 UTC m=+394.736785586 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/50ba178c-7c75-47c1-bd38-740ebeecf1fc-cert") pod "llmisvc-controller-manager-68cc5db7c4-449rq" (UID: "50ba178c-7c75-47c1-bd38-740ebeecf1fc") : secret "llmisvc-webhook-server-cert" not found Apr 17 16:38:10.998486 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:10.998418 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9e91b1f8-9725-4761-b726-d5e59ab3c67c-data\") pod \"seaweedfs-86cc847c5c-skv55\" (UID: \"9e91b1f8-9725-4761-b726-d5e59ab3c67c\") " pod="kserve/seaweedfs-86cc847c5c-skv55" Apr 17 16:38:11.008847 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:11.008826 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd4kw\" (UniqueName: \"kubernetes.io/projected/9e91b1f8-9725-4761-b726-d5e59ab3c67c-kube-api-access-sd4kw\") pod \"seaweedfs-86cc847c5c-skv55\" (UID: \"9e91b1f8-9725-4761-b726-d5e59ab3c67c\") " pod="kserve/seaweedfs-86cc847c5c-skv55" Apr 17 16:38:11.009014 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:11.008998 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgg7j\" (UniqueName: \"kubernetes.io/projected/50ba178c-7c75-47c1-bd38-740ebeecf1fc-kube-api-access-vgg7j\") pod \"llmisvc-controller-manager-68cc5db7c4-449rq\" (UID: \"50ba178c-7c75-47c1-bd38-740ebeecf1fc\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-449rq" Apr 17 16:38:11.009160 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:11.009139 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85nng\" (UniqueName: \"kubernetes.io/projected/06646c4e-8609-49b7-a255-41cbfb0ead68-kube-api-access-85nng\") pod \"kserve-controller-manager-85bb65f8c4-d8wsl\" (UID: \"06646c4e-8609-49b7-a255-41cbfb0ead68\") " pod="kserve/kserve-controller-manager-85bb65f8c4-d8wsl" Apr 17 16:38:11.177505 ip-10-0-141-239 kubenswrapper[2548]: I0417 
16:38:11.177422 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-skv55" Apr 17 16:38:11.299295 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:11.299268 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-skv55"] Apr 17 16:38:11.301819 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:38:11.301794 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e91b1f8_9725_4761_b726_d5e59ab3c67c.slice/crio-7af7fa935cdf4a659dbe508ee63e441e2c136ef36489f570ed3f3bac05200f46 WatchSource:0}: Error finding container 7af7fa935cdf4a659dbe508ee63e441e2c136ef36489f570ed3f3bac05200f46: Status 404 returned error can't find the container with id 7af7fa935cdf4a659dbe508ee63e441e2c136ef36489f570ed3f3bac05200f46 Apr 17 16:38:11.502113 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:11.502036 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06646c4e-8609-49b7-a255-41cbfb0ead68-cert\") pod \"kserve-controller-manager-85bb65f8c4-d8wsl\" (UID: \"06646c4e-8609-49b7-a255-41cbfb0ead68\") " pod="kserve/kserve-controller-manager-85bb65f8c4-d8wsl" Apr 17 16:38:11.502113 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:11.502068 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50ba178c-7c75-47c1-bd38-740ebeecf1fc-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-449rq\" (UID: \"50ba178c-7c75-47c1-bd38-740ebeecf1fc\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-449rq" Apr 17 16:38:11.504588 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:11.504565 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50ba178c-7c75-47c1-bd38-740ebeecf1fc-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-449rq\" (UID: 
\"50ba178c-7c75-47c1-bd38-740ebeecf1fc\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-449rq" Apr 17 16:38:11.504694 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:11.504601 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06646c4e-8609-49b7-a255-41cbfb0ead68-cert\") pod \"kserve-controller-manager-85bb65f8c4-d8wsl\" (UID: \"06646c4e-8609-49b7-a255-41cbfb0ead68\") " pod="kserve/kserve-controller-manager-85bb65f8c4-d8wsl" Apr 17 16:38:11.514685 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:11.514657 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-skv55" event={"ID":"9e91b1f8-9725-4761-b726-d5e59ab3c67c","Type":"ContainerStarted","Data":"7af7fa935cdf4a659dbe508ee63e441e2c136ef36489f570ed3f3bac05200f46"} Apr 17 16:38:11.734062 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:11.734030 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-85bb65f8c4-d8wsl" Apr 17 16:38:11.745726 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:11.745701 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-449rq" Apr 17 16:38:11.890375 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:11.890333 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-d8wsl"] Apr 17 16:38:11.909792 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:11.909764 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-449rq"] Apr 17 16:38:11.951535 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:38:11.951498 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06646c4e_8609_49b7_a255_41cbfb0ead68.slice/crio-07985b6ec71dbd4042f0f5b9a729ee9af8e9237f2ec2c6b20f58a0a2c09f1fb0 WatchSource:0}: Error finding container 07985b6ec71dbd4042f0f5b9a729ee9af8e9237f2ec2c6b20f58a0a2c09f1fb0: Status 404 returned error can't find the container with id 07985b6ec71dbd4042f0f5b9a729ee9af8e9237f2ec2c6b20f58a0a2c09f1fb0 Apr 17 16:38:11.952054 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:38:11.952022 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod50ba178c_7c75_47c1_bd38_740ebeecf1fc.slice/crio-c249fdfa72d8e5b7d7055f3620c2bf01a39a8e3f758f9c8387d3096bae198142 WatchSource:0}: Error finding container c249fdfa72d8e5b7d7055f3620c2bf01a39a8e3f758f9c8387d3096bae198142: Status 404 returned error can't find the container with id c249fdfa72d8e5b7d7055f3620c2bf01a39a8e3f758f9c8387d3096bae198142 Apr 17 16:38:12.519236 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:12.519201 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-449rq" event={"ID":"50ba178c-7c75-47c1-bd38-740ebeecf1fc","Type":"ContainerStarted","Data":"c249fdfa72d8e5b7d7055f3620c2bf01a39a8e3f758f9c8387d3096bae198142"} Apr 17 16:38:12.520407 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:12.520379 2548 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85bb65f8c4-d8wsl" event={"ID":"06646c4e-8609-49b7-a255-41cbfb0ead68","Type":"ContainerStarted","Data":"07985b6ec71dbd4042f0f5b9a729ee9af8e9237f2ec2c6b20f58a0a2c09f1fb0"} Apr 17 16:38:16.534437 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:16.534400 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-skv55" event={"ID":"9e91b1f8-9725-4761-b726-d5e59ab3c67c","Type":"ContainerStarted","Data":"7848f6d054f2c72abac47c66ab74322336a5efa6f997d5b5d1d9bbf240479a97"} Apr 17 16:38:16.534872 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:16.534609 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-skv55" Apr 17 16:38:16.535811 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:16.535788 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85bb65f8c4-d8wsl" event={"ID":"06646c4e-8609-49b7-a255-41cbfb0ead68","Type":"ContainerStarted","Data":"563514e146dd7407a88691e2b2198efcfbaa96d79be7bddd67f33353ee5242e7"} Apr 17 16:38:16.535951 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:16.535878 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-85bb65f8c4-d8wsl" Apr 17 16:38:16.537006 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:16.536984 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-449rq" event={"ID":"50ba178c-7c75-47c1-bd38-740ebeecf1fc","Type":"ContainerStarted","Data":"0c728ab45ae490740836d490eeb8482311d905d01b4293b6dab38e991b7fefb7"} Apr 17 16:38:16.537076 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:16.537034 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-449rq" Apr 17 16:38:16.563553 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:16.563512 2548 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-skv55" podStartSLOduration=1.521123168 podStartE2EDuration="6.563498928s" podCreationTimestamp="2026-04-17 16:38:10 +0000 UTC" firstStartedPulling="2026-04-17 16:38:11.303206691 +0000 UTC m=+394.541722466" lastFinishedPulling="2026-04-17 16:38:16.345582456 +0000 UTC m=+399.584098226" observedRunningTime="2026-04-17 16:38:16.562081699 +0000 UTC m=+399.800597477" watchObservedRunningTime="2026-04-17 16:38:16.563498928 +0000 UTC m=+399.802014707" Apr 17 16:38:16.597513 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:16.597425 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-449rq" podStartSLOduration=2.263117264 podStartE2EDuration="6.597412926s" podCreationTimestamp="2026-04-17 16:38:10 +0000 UTC" firstStartedPulling="2026-04-17 16:38:11.953423559 +0000 UTC m=+395.191939317" lastFinishedPulling="2026-04-17 16:38:16.28771922 +0000 UTC m=+399.526234979" observedRunningTime="2026-04-17 16:38:16.594518263 +0000 UTC m=+399.833034042" watchObservedRunningTime="2026-04-17 16:38:16.597412926 +0000 UTC m=+399.835928738" Apr 17 16:38:16.631586 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:16.631538 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-85bb65f8c4-d8wsl" podStartSLOduration=2.336493983 podStartE2EDuration="6.63152169s" podCreationTimestamp="2026-04-17 16:38:10 +0000 UTC" firstStartedPulling="2026-04-17 16:38:11.95314659 +0000 UTC m=+395.191662350" lastFinishedPulling="2026-04-17 16:38:16.248174298 +0000 UTC m=+399.486690057" observedRunningTime="2026-04-17 16:38:16.628935438 +0000 UTC m=+399.867451216" watchObservedRunningTime="2026-04-17 16:38:16.63152169 +0000 UTC m=+399.870037468" Apr 17 16:38:22.542231 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:22.542200 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve/seaweedfs-86cc847c5c-skv55" Apr 17 16:38:47.542085 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:47.542045 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-449rq" Apr 17 16:38:47.545159 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:47.545138 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-85bb65f8c4-d8wsl" Apr 17 16:38:49.285011 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.284072 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-d8wsl"] Apr 17 16:38:49.285011 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.284441 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-85bb65f8c4-d8wsl" podUID="06646c4e-8609-49b7-a255-41cbfb0ead68" containerName="manager" containerID="cri-o://563514e146dd7407a88691e2b2198efcfbaa96d79be7bddd67f33353ee5242e7" gracePeriod=10 Apr 17 16:38:49.327312 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.327286 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-q9kgs"] Apr 17 16:38:49.373265 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.373242 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-q9kgs"] Apr 17 16:38:49.373369 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.373358 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85bb65f8c4-q9kgs" Apr 17 16:38:49.492579 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.492547 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e67d4fb-3af2-480a-9d00-a3b393101eca-cert\") pod \"kserve-controller-manager-85bb65f8c4-q9kgs\" (UID: \"7e67d4fb-3af2-480a-9d00-a3b393101eca\") " pod="kserve/kserve-controller-manager-85bb65f8c4-q9kgs" Apr 17 16:38:49.492698 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.492617 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5stb\" (UniqueName: \"kubernetes.io/projected/7e67d4fb-3af2-480a-9d00-a3b393101eca-kube-api-access-n5stb\") pod \"kserve-controller-manager-85bb65f8c4-q9kgs\" (UID: \"7e67d4fb-3af2-480a-9d00-a3b393101eca\") " pod="kserve/kserve-controller-manager-85bb65f8c4-q9kgs" Apr 17 16:38:49.533552 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.533529 2548 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85bb65f8c4-d8wsl" Apr 17 16:38:49.593605 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.593513 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e67d4fb-3af2-480a-9d00-a3b393101eca-cert\") pod \"kserve-controller-manager-85bb65f8c4-q9kgs\" (UID: \"7e67d4fb-3af2-480a-9d00-a3b393101eca\") " pod="kserve/kserve-controller-manager-85bb65f8c4-q9kgs" Apr 17 16:38:49.593605 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.593583 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5stb\" (UniqueName: \"kubernetes.io/projected/7e67d4fb-3af2-480a-9d00-a3b393101eca-kube-api-access-n5stb\") pod \"kserve-controller-manager-85bb65f8c4-q9kgs\" (UID: \"7e67d4fb-3af2-480a-9d00-a3b393101eca\") " pod="kserve/kserve-controller-manager-85bb65f8c4-q9kgs" Apr 17 16:38:49.596087 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.596063 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e67d4fb-3af2-480a-9d00-a3b393101eca-cert\") pod \"kserve-controller-manager-85bb65f8c4-q9kgs\" (UID: \"7e67d4fb-3af2-480a-9d00-a3b393101eca\") " pod="kserve/kserve-controller-manager-85bb65f8c4-q9kgs" Apr 17 16:38:49.604047 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.604004 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5stb\" (UniqueName: \"kubernetes.io/projected/7e67d4fb-3af2-480a-9d00-a3b393101eca-kube-api-access-n5stb\") pod \"kserve-controller-manager-85bb65f8c4-q9kgs\" (UID: \"7e67d4fb-3af2-480a-9d00-a3b393101eca\") " pod="kserve/kserve-controller-manager-85bb65f8c4-q9kgs" Apr 17 16:38:49.641369 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.641336 2548 generic.go:358] "Generic (PLEG): container finished" podID="06646c4e-8609-49b7-a255-41cbfb0ead68" 
containerID="563514e146dd7407a88691e2b2198efcfbaa96d79be7bddd67f33353ee5242e7" exitCode=0 Apr 17 16:38:49.641515 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.641409 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-85bb65f8c4-d8wsl" Apr 17 16:38:49.641515 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.641425 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85bb65f8c4-d8wsl" event={"ID":"06646c4e-8609-49b7-a255-41cbfb0ead68","Type":"ContainerDied","Data":"563514e146dd7407a88691e2b2198efcfbaa96d79be7bddd67f33353ee5242e7"} Apr 17 16:38:49.641515 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.641465 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85bb65f8c4-d8wsl" event={"ID":"06646c4e-8609-49b7-a255-41cbfb0ead68","Type":"ContainerDied","Data":"07985b6ec71dbd4042f0f5b9a729ee9af8e9237f2ec2c6b20f58a0a2c09f1fb0"} Apr 17 16:38:49.641515 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.641486 2548 scope.go:117] "RemoveContainer" containerID="563514e146dd7407a88691e2b2198efcfbaa96d79be7bddd67f33353ee5242e7" Apr 17 16:38:49.649010 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.648995 2548 scope.go:117] "RemoveContainer" containerID="563514e146dd7407a88691e2b2198efcfbaa96d79be7bddd67f33353ee5242e7" Apr 17 16:38:49.649272 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:38:49.649250 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"563514e146dd7407a88691e2b2198efcfbaa96d79be7bddd67f33353ee5242e7\": container with ID starting with 563514e146dd7407a88691e2b2198efcfbaa96d79be7bddd67f33353ee5242e7 not found: ID does not exist" containerID="563514e146dd7407a88691e2b2198efcfbaa96d79be7bddd67f33353ee5242e7" Apr 17 16:38:49.649342 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.649279 2548 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"563514e146dd7407a88691e2b2198efcfbaa96d79be7bddd67f33353ee5242e7"} err="failed to get container status \"563514e146dd7407a88691e2b2198efcfbaa96d79be7bddd67f33353ee5242e7\": rpc error: code = NotFound desc = could not find container \"563514e146dd7407a88691e2b2198efcfbaa96d79be7bddd67f33353ee5242e7\": container with ID starting with 563514e146dd7407a88691e2b2198efcfbaa96d79be7bddd67f33353ee5242e7 not found: ID does not exist" Apr 17 16:38:49.694149 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.694120 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85nng\" (UniqueName: \"kubernetes.io/projected/06646c4e-8609-49b7-a255-41cbfb0ead68-kube-api-access-85nng\") pod \"06646c4e-8609-49b7-a255-41cbfb0ead68\" (UID: \"06646c4e-8609-49b7-a255-41cbfb0ead68\") " Apr 17 16:38:49.694285 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.694166 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06646c4e-8609-49b7-a255-41cbfb0ead68-cert\") pod \"06646c4e-8609-49b7-a255-41cbfb0ead68\" (UID: \"06646c4e-8609-49b7-a255-41cbfb0ead68\") " Apr 17 16:38:49.696397 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.696366 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06646c4e-8609-49b7-a255-41cbfb0ead68-cert" (OuterVolumeSpecName: "cert") pod "06646c4e-8609-49b7-a255-41cbfb0ead68" (UID: "06646c4e-8609-49b7-a255-41cbfb0ead68"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:38:49.696485 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.696389 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06646c4e-8609-49b7-a255-41cbfb0ead68-kube-api-access-85nng" (OuterVolumeSpecName: "kube-api-access-85nng") pod "06646c4e-8609-49b7-a255-41cbfb0ead68" (UID: "06646c4e-8609-49b7-a255-41cbfb0ead68"). InnerVolumeSpecName "kube-api-access-85nng". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:38:49.737181 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.737153 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-85bb65f8c4-q9kgs" Apr 17 16:38:49.795835 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.795801 2548 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-85nng\" (UniqueName: \"kubernetes.io/projected/06646c4e-8609-49b7-a255-41cbfb0ead68-kube-api-access-85nng\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:38:49.795835 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.795831 2548 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06646c4e-8609-49b7-a255-41cbfb0ead68-cert\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:38:49.863387 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.863367 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-q9kgs"] Apr 17 16:38:49.865541 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:38:49.865512 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e67d4fb_3af2_480a_9d00_a3b393101eca.slice/crio-d772024886d840a641649a690f7414266b84e76d90dc84359c43fa30f016445f WatchSource:0}: Error finding container d772024886d840a641649a690f7414266b84e76d90dc84359c43fa30f016445f: Status 
404 returned error can't find the container with id d772024886d840a641649a690f7414266b84e76d90dc84359c43fa30f016445f Apr 17 16:38:49.973680 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.973650 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-d8wsl"] Apr 17 16:38:49.983181 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:49.983158 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-d8wsl"] Apr 17 16:38:50.645777 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:50.645696 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85bb65f8c4-q9kgs" event={"ID":"7e67d4fb-3af2-480a-9d00-a3b393101eca","Type":"ContainerStarted","Data":"083b49318a5f2fcddc144184cf2180eb25a07d5ba775572e856700bc99aab8b6"} Apr 17 16:38:50.645777 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:50.645732 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85bb65f8c4-q9kgs" event={"ID":"7e67d4fb-3af2-480a-9d00-a3b393101eca","Type":"ContainerStarted","Data":"d772024886d840a641649a690f7414266b84e76d90dc84359c43fa30f016445f"} Apr 17 16:38:50.645777 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:50.645762 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-85bb65f8c4-q9kgs" Apr 17 16:38:51.326144 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:38:51.326116 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06646c4e-8609-49b7-a255-41cbfb0ead68" path="/var/lib/kubelet/pods/06646c4e-8609-49b7-a255-41cbfb0ead68/volumes" Apr 17 16:39:16.466913 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.466844 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-85bb65f8c4-q9kgs" podStartSLOduration=26.922451944 podStartE2EDuration="27.466829525s" podCreationTimestamp="2026-04-17 
16:38:49 +0000 UTC" firstStartedPulling="2026-04-17 16:38:49.86676044 +0000 UTC m=+433.105276199" lastFinishedPulling="2026-04-17 16:38:50.41113802 +0000 UTC m=+433.649653780" observedRunningTime="2026-04-17 16:38:50.669778407 +0000 UTC m=+433.908294186" watchObservedRunningTime="2026-04-17 16:39:16.466829525 +0000 UTC m=+459.705345304" Apr 17 16:39:16.467265 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.467005 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-69569bd448-mpfbk"] Apr 17 16:39:16.467305 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.467269 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06646c4e-8609-49b7-a255-41cbfb0ead68" containerName="manager" Apr 17 16:39:16.467305 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.467278 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="06646c4e-8609-49b7-a255-41cbfb0ead68" containerName="manager" Apr 17 16:39:16.467369 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.467349 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="06646c4e-8609-49b7-a255-41cbfb0ead68" containerName="manager" Apr 17 16:39:16.470237 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.470220 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.481520 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.481494 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69569bd448-mpfbk"] Apr 17 16:39:16.590255 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.590219 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f139d23b-3c37-4b10-9082-81783cc8bff8-console-serving-cert\") pod \"console-69569bd448-mpfbk\" (UID: \"f139d23b-3c37-4b10-9082-81783cc8bff8\") " pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.590423 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.590274 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnflm\" (UniqueName: \"kubernetes.io/projected/f139d23b-3c37-4b10-9082-81783cc8bff8-kube-api-access-vnflm\") pod \"console-69569bd448-mpfbk\" (UID: \"f139d23b-3c37-4b10-9082-81783cc8bff8\") " pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.590423 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.590349 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f139d23b-3c37-4b10-9082-81783cc8bff8-service-ca\") pod \"console-69569bd448-mpfbk\" (UID: \"f139d23b-3c37-4b10-9082-81783cc8bff8\") " pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.590423 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.590392 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f139d23b-3c37-4b10-9082-81783cc8bff8-console-config\") pod \"console-69569bd448-mpfbk\" (UID: \"f139d23b-3c37-4b10-9082-81783cc8bff8\") " pod="openshift-console/console-69569bd448-mpfbk" Apr 17 
16:39:16.590423 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.590408 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f139d23b-3c37-4b10-9082-81783cc8bff8-trusted-ca-bundle\") pod \"console-69569bd448-mpfbk\" (UID: \"f139d23b-3c37-4b10-9082-81783cc8bff8\") " pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.590581 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.590429 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f139d23b-3c37-4b10-9082-81783cc8bff8-oauth-serving-cert\") pod \"console-69569bd448-mpfbk\" (UID: \"f139d23b-3c37-4b10-9082-81783cc8bff8\") " pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.590581 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.590487 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f139d23b-3c37-4b10-9082-81783cc8bff8-console-oauth-config\") pod \"console-69569bd448-mpfbk\" (UID: \"f139d23b-3c37-4b10-9082-81783cc8bff8\") " pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.691803 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.691774 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f139d23b-3c37-4b10-9082-81783cc8bff8-service-ca\") pod \"console-69569bd448-mpfbk\" (UID: \"f139d23b-3c37-4b10-9082-81783cc8bff8\") " pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.691954 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.691811 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f139d23b-3c37-4b10-9082-81783cc8bff8-console-config\") pod 
\"console-69569bd448-mpfbk\" (UID: \"f139d23b-3c37-4b10-9082-81783cc8bff8\") " pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.691954 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.691826 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f139d23b-3c37-4b10-9082-81783cc8bff8-trusted-ca-bundle\") pod \"console-69569bd448-mpfbk\" (UID: \"f139d23b-3c37-4b10-9082-81783cc8bff8\") " pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.691954 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.691843 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f139d23b-3c37-4b10-9082-81783cc8bff8-oauth-serving-cert\") pod \"console-69569bd448-mpfbk\" (UID: \"f139d23b-3c37-4b10-9082-81783cc8bff8\") " pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.691954 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.691912 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f139d23b-3c37-4b10-9082-81783cc8bff8-console-oauth-config\") pod \"console-69569bd448-mpfbk\" (UID: \"f139d23b-3c37-4b10-9082-81783cc8bff8\") " pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.691954 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.691954 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f139d23b-3c37-4b10-9082-81783cc8bff8-console-serving-cert\") pod \"console-69569bd448-mpfbk\" (UID: \"f139d23b-3c37-4b10-9082-81783cc8bff8\") " pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.692190 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.692108 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnflm\" (UniqueName: 
\"kubernetes.io/projected/f139d23b-3c37-4b10-9082-81783cc8bff8-kube-api-access-vnflm\") pod \"console-69569bd448-mpfbk\" (UID: \"f139d23b-3c37-4b10-9082-81783cc8bff8\") " pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.692584 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.692562 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f139d23b-3c37-4b10-9082-81783cc8bff8-service-ca\") pod \"console-69569bd448-mpfbk\" (UID: \"f139d23b-3c37-4b10-9082-81783cc8bff8\") " pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.692690 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.692616 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f139d23b-3c37-4b10-9082-81783cc8bff8-console-config\") pod \"console-69569bd448-mpfbk\" (UID: \"f139d23b-3c37-4b10-9082-81783cc8bff8\") " pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.692690 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.692616 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f139d23b-3c37-4b10-9082-81783cc8bff8-oauth-serving-cert\") pod \"console-69569bd448-mpfbk\" (UID: \"f139d23b-3c37-4b10-9082-81783cc8bff8\") " pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.692868 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.692849 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f139d23b-3c37-4b10-9082-81783cc8bff8-trusted-ca-bundle\") pod \"console-69569bd448-mpfbk\" (UID: \"f139d23b-3c37-4b10-9082-81783cc8bff8\") " pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.695007 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.694984 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f139d23b-3c37-4b10-9082-81783cc8bff8-console-oauth-config\") pod \"console-69569bd448-mpfbk\" (UID: \"f139d23b-3c37-4b10-9082-81783cc8bff8\") " pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.695103 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.695071 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f139d23b-3c37-4b10-9082-81783cc8bff8-console-serving-cert\") pod \"console-69569bd448-mpfbk\" (UID: \"f139d23b-3c37-4b10-9082-81783cc8bff8\") " pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.700757 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.700733 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnflm\" (UniqueName: \"kubernetes.io/projected/f139d23b-3c37-4b10-9082-81783cc8bff8-kube-api-access-vnflm\") pod \"console-69569bd448-mpfbk\" (UID: \"f139d23b-3c37-4b10-9082-81783cc8bff8\") " pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.779836 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.779764 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:16.901664 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:16.901561 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69569bd448-mpfbk"] Apr 17 16:39:16.903670 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:39:16.903647 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf139d23b_3c37_4b10_9082_81783cc8bff8.slice/crio-4610de32a63703337db08cfdc27e7e251dd652d49e67dad59a722c4ed14fd47a WatchSource:0}: Error finding container 4610de32a63703337db08cfdc27e7e251dd652d49e67dad59a722c4ed14fd47a: Status 404 returned error can't find the container with id 4610de32a63703337db08cfdc27e7e251dd652d49e67dad59a722c4ed14fd47a Apr 17 16:39:17.730852 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:17.730813 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69569bd448-mpfbk" event={"ID":"f139d23b-3c37-4b10-9082-81783cc8bff8","Type":"ContainerStarted","Data":"cf847885d7a8c15b84a119cdde7f2713becd2abf136e22a6dd0ab70446bb150d"} Apr 17 16:39:17.730852 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:17.730859 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69569bd448-mpfbk" event={"ID":"f139d23b-3c37-4b10-9082-81783cc8bff8","Type":"ContainerStarted","Data":"4610de32a63703337db08cfdc27e7e251dd652d49e67dad59a722c4ed14fd47a"} Apr 17 16:39:17.750880 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:17.750835 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69569bd448-mpfbk" podStartSLOduration=1.7508216509999999 podStartE2EDuration="1.750821651s" podCreationTimestamp="2026-04-17 16:39:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:39:17.750155708 +0000 UTC 
m=+460.988671500" watchObservedRunningTime="2026-04-17 16:39:17.750821651 +0000 UTC m=+460.989337430" Apr 17 16:39:21.654938 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:21.654879 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-85bb65f8c4-q9kgs" Apr 17 16:39:22.463434 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:22.463403 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-qm6bd"] Apr 17 16:39:22.467113 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:22.467088 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-qm6bd" Apr 17 16:39:22.469387 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:22.469365 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 17 16:39:22.469387 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:22.469383 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-mklvz\"" Apr 17 16:39:22.475852 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:22.475830 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-qm6bd"] Apr 17 16:39:22.540103 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:22.540061 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fz9s\" (UniqueName: \"kubernetes.io/projected/9a07f87b-e090-4bc7-8839-82ce1a4ddb0c-kube-api-access-7fz9s\") pod \"model-serving-api-86f7b4b499-qm6bd\" (UID: \"9a07f87b-e090-4bc7-8839-82ce1a4ddb0c\") " pod="kserve/model-serving-api-86f7b4b499-qm6bd" Apr 17 16:39:22.540247 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:22.540133 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9a07f87b-e090-4bc7-8839-82ce1a4ddb0c-tls-certs\") pod \"model-serving-api-86f7b4b499-qm6bd\" (UID: \"9a07f87b-e090-4bc7-8839-82ce1a4ddb0c\") " pod="kserve/model-serving-api-86f7b4b499-qm6bd" Apr 17 16:39:22.640863 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:22.640832 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9a07f87b-e090-4bc7-8839-82ce1a4ddb0c-tls-certs\") pod \"model-serving-api-86f7b4b499-qm6bd\" (UID: \"9a07f87b-e090-4bc7-8839-82ce1a4ddb0c\") " pod="kserve/model-serving-api-86f7b4b499-qm6bd" Apr 17 16:39:22.641007 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:22.640930 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fz9s\" (UniqueName: \"kubernetes.io/projected/9a07f87b-e090-4bc7-8839-82ce1a4ddb0c-kube-api-access-7fz9s\") pod \"model-serving-api-86f7b4b499-qm6bd\" (UID: \"9a07f87b-e090-4bc7-8839-82ce1a4ddb0c\") " pod="kserve/model-serving-api-86f7b4b499-qm6bd" Apr 17 16:39:22.641007 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:39:22.640948 2548 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 17 16:39:22.641007 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:39:22.641004 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a07f87b-e090-4bc7-8839-82ce1a4ddb0c-tls-certs podName:9a07f87b-e090-4bc7-8839-82ce1a4ddb0c nodeName:}" failed. No retries permitted until 2026-04-17 16:39:23.140987524 +0000 UTC m=+466.379503285 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/9a07f87b-e090-4bc7-8839-82ce1a4ddb0c-tls-certs") pod "model-serving-api-86f7b4b499-qm6bd" (UID: "9a07f87b-e090-4bc7-8839-82ce1a4ddb0c") : secret "model-serving-api-tls" not found Apr 17 16:39:22.650423 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:22.650401 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fz9s\" (UniqueName: \"kubernetes.io/projected/9a07f87b-e090-4bc7-8839-82ce1a4ddb0c-kube-api-access-7fz9s\") pod \"model-serving-api-86f7b4b499-qm6bd\" (UID: \"9a07f87b-e090-4bc7-8839-82ce1a4ddb0c\") " pod="kserve/model-serving-api-86f7b4b499-qm6bd" Apr 17 16:39:23.144479 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:23.144445 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9a07f87b-e090-4bc7-8839-82ce1a4ddb0c-tls-certs\") pod \"model-serving-api-86f7b4b499-qm6bd\" (UID: \"9a07f87b-e090-4bc7-8839-82ce1a4ddb0c\") " pod="kserve/model-serving-api-86f7b4b499-qm6bd" Apr 17 16:39:23.147048 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:23.147028 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9a07f87b-e090-4bc7-8839-82ce1a4ddb0c-tls-certs\") pod \"model-serving-api-86f7b4b499-qm6bd\" (UID: \"9a07f87b-e090-4bc7-8839-82ce1a4ddb0c\") " pod="kserve/model-serving-api-86f7b4b499-qm6bd" Apr 17 16:39:23.378773 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:23.378743 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-qm6bd" Apr 17 16:39:23.497075 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:23.497046 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-qm6bd"] Apr 17 16:39:23.500149 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:39:23.500121 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a07f87b_e090_4bc7_8839_82ce1a4ddb0c.slice/crio-e81c571e2b6f8604109d9813a0eb1cfa4ea8cc72c9ea958f63ee1f3a37358b80 WatchSource:0}: Error finding container e81c571e2b6f8604109d9813a0eb1cfa4ea8cc72c9ea958f63ee1f3a37358b80: Status 404 returned error can't find the container with id e81c571e2b6f8604109d9813a0eb1cfa4ea8cc72c9ea958f63ee1f3a37358b80 Apr 17 16:39:23.754599 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:23.754529 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-qm6bd" event={"ID":"9a07f87b-e090-4bc7-8839-82ce1a4ddb0c","Type":"ContainerStarted","Data":"e81c571e2b6f8604109d9813a0eb1cfa4ea8cc72c9ea958f63ee1f3a37358b80"} Apr 17 16:39:25.763137 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:25.763104 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-qm6bd" event={"ID":"9a07f87b-e090-4bc7-8839-82ce1a4ddb0c","Type":"ContainerStarted","Data":"4d5f2af36f3c78026b02a1b93cd0846823af1a2aafcfa0fbfe350ac5e97be5d8"} Apr 17 16:39:25.763489 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:25.763217 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-qm6bd" Apr 17 16:39:25.781315 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:25.781275 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-qm6bd" podStartSLOduration=2.214324715 podStartE2EDuration="3.781263495s" 
podCreationTimestamp="2026-04-17 16:39:22 +0000 UTC" firstStartedPulling="2026-04-17 16:39:23.501743986 +0000 UTC m=+466.740259742" lastFinishedPulling="2026-04-17 16:39:25.068682765 +0000 UTC m=+468.307198522" observedRunningTime="2026-04-17 16:39:25.779019514 +0000 UTC m=+469.017535292" watchObservedRunningTime="2026-04-17 16:39:25.781263495 +0000 UTC m=+469.019779274" Apr 17 16:39:26.780123 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:26.780095 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:26.780480 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:26.780373 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:26.784881 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:26.784861 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:27.773518 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:27.773485 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69569bd448-mpfbk" Apr 17 16:39:27.834605 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:27.834566 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c5488589d-m2r5x"] Apr 17 16:39:36.771128 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:36.771100 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-qm6bd" Apr 17 16:39:52.853421 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:52.853369 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-c5488589d-m2r5x" podUID="55cea390-288f-4402-91ae-e319c9c34078" containerName="console" containerID="cri-o://c6f70496f6421c0157ece058a91219d016786259c87853bcad8ee604dc578e5c" gracePeriod=15 Apr 17 16:39:53.100740 
ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.100717 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c5488589d-m2r5x_55cea390-288f-4402-91ae-e319c9c34078/console/0.log" Apr 17 16:39:53.100852 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.100779 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c5488589d-m2r5x" Apr 17 16:39:53.165442 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.165378 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-console-config\") pod \"55cea390-288f-4402-91ae-e319c9c34078\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " Apr 17 16:39:53.165442 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.165409 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p97pz\" (UniqueName: \"kubernetes.io/projected/55cea390-288f-4402-91ae-e319c9c34078-kube-api-access-p97pz\") pod \"55cea390-288f-4402-91ae-e319c9c34078\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " Apr 17 16:39:53.165442 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.165436 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55cea390-288f-4402-91ae-e319c9c34078-console-oauth-config\") pod \"55cea390-288f-4402-91ae-e319c9c34078\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " Apr 17 16:39:53.165651 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.165460 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55cea390-288f-4402-91ae-e319c9c34078-console-serving-cert\") pod \"55cea390-288f-4402-91ae-e319c9c34078\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " Apr 17 16:39:53.165651 ip-10-0-141-239 
kubenswrapper[2548]: I0417 16:39:53.165500 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-service-ca\") pod \"55cea390-288f-4402-91ae-e319c9c34078\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " Apr 17 16:39:53.165651 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.165546 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-oauth-serving-cert\") pod \"55cea390-288f-4402-91ae-e319c9c34078\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " Apr 17 16:39:53.165651 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.165594 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-trusted-ca-bundle\") pod \"55cea390-288f-4402-91ae-e319c9c34078\" (UID: \"55cea390-288f-4402-91ae-e319c9c34078\") " Apr 17 16:39:53.165849 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.165769 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-console-config" (OuterVolumeSpecName: "console-config") pod "55cea390-288f-4402-91ae-e319c9c34078" (UID: "55cea390-288f-4402-91ae-e319c9c34078"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:39:53.166022 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.165991 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "55cea390-288f-4402-91ae-e319c9c34078" (UID: "55cea390-288f-4402-91ae-e319c9c34078"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:39:53.166115 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.165993 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-service-ca" (OuterVolumeSpecName: "service-ca") pod "55cea390-288f-4402-91ae-e319c9c34078" (UID: "55cea390-288f-4402-91ae-e319c9c34078"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:39:53.166417 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.166369 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "55cea390-288f-4402-91ae-e319c9c34078" (UID: "55cea390-288f-4402-91ae-e319c9c34078"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:39:53.167644 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.167620 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cea390-288f-4402-91ae-e319c9c34078-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "55cea390-288f-4402-91ae-e319c9c34078" (UID: "55cea390-288f-4402-91ae-e319c9c34078"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:39:53.167740 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.167658 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55cea390-288f-4402-91ae-e319c9c34078-kube-api-access-p97pz" (OuterVolumeSpecName: "kube-api-access-p97pz") pod "55cea390-288f-4402-91ae-e319c9c34078" (UID: "55cea390-288f-4402-91ae-e319c9c34078"). InnerVolumeSpecName "kube-api-access-p97pz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:39:53.167782 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.167741 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cea390-288f-4402-91ae-e319c9c34078-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "55cea390-288f-4402-91ae-e319c9c34078" (UID: "55cea390-288f-4402-91ae-e319c9c34078"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:39:53.266254 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.266231 2548 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-oauth-serving-cert\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:39:53.266254 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.266252 2548 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-trusted-ca-bundle\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:39:53.266378 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.266263 2548 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-console-config\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:39:53.266378 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.266271 2548 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p97pz\" (UniqueName: \"kubernetes.io/projected/55cea390-288f-4402-91ae-e319c9c34078-kube-api-access-p97pz\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:39:53.266378 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.266281 2548 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/55cea390-288f-4402-91ae-e319c9c34078-console-oauth-config\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:39:53.266378 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.266289 2548 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55cea390-288f-4402-91ae-e319c9c34078-console-serving-cert\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:39:53.266378 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.266298 2548 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55cea390-288f-4402-91ae-e319c9c34078-service-ca\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:39:53.853910 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.853872 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c5488589d-m2r5x_55cea390-288f-4402-91ae-e319c9c34078/console/0.log" Apr 17 16:39:53.854369 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.853933 2548 generic.go:358] "Generic (PLEG): container finished" podID="55cea390-288f-4402-91ae-e319c9c34078" containerID="c6f70496f6421c0157ece058a91219d016786259c87853bcad8ee604dc578e5c" exitCode=2 Apr 17 16:39:53.854369 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.853967 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5488589d-m2r5x" event={"ID":"55cea390-288f-4402-91ae-e319c9c34078","Type":"ContainerDied","Data":"c6f70496f6421c0157ece058a91219d016786259c87853bcad8ee604dc578e5c"} Apr 17 16:39:53.854369 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.854012 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5488589d-m2r5x" event={"ID":"55cea390-288f-4402-91ae-e319c9c34078","Type":"ContainerDied","Data":"c9a7e5e2e5192a52067be04ce0b791e153f24317347ceb809010a157645dd5be"} Apr 17 16:39:53.854369 ip-10-0-141-239 
kubenswrapper[2548]: I0417 16:39:53.854016 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c5488589d-m2r5x" Apr 17 16:39:53.854369 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.854029 2548 scope.go:117] "RemoveContainer" containerID="c6f70496f6421c0157ece058a91219d016786259c87853bcad8ee604dc578e5c" Apr 17 16:39:53.861998 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.861976 2548 scope.go:117] "RemoveContainer" containerID="c6f70496f6421c0157ece058a91219d016786259c87853bcad8ee604dc578e5c" Apr 17 16:39:53.862268 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:39:53.862249 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6f70496f6421c0157ece058a91219d016786259c87853bcad8ee604dc578e5c\": container with ID starting with c6f70496f6421c0157ece058a91219d016786259c87853bcad8ee604dc578e5c not found: ID does not exist" containerID="c6f70496f6421c0157ece058a91219d016786259c87853bcad8ee604dc578e5c" Apr 17 16:39:53.862317 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.862276 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6f70496f6421c0157ece058a91219d016786259c87853bcad8ee604dc578e5c"} err="failed to get container status \"c6f70496f6421c0157ece058a91219d016786259c87853bcad8ee604dc578e5c\": rpc error: code = NotFound desc = could not find container \"c6f70496f6421c0157ece058a91219d016786259c87853bcad8ee604dc578e5c\": container with ID starting with c6f70496f6421c0157ece058a91219d016786259c87853bcad8ee604dc578e5c not found: ID does not exist" Apr 17 16:39:53.892710 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.892682 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c5488589d-m2r5x"] Apr 17 16:39:53.896317 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:53.896289 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-c5488589d-m2r5x"] Apr 17 16:39:55.326314 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:39:55.326285 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55cea390-288f-4402-91ae-e319c9c34078" path="/var/lib/kubelet/pods/55cea390-288f-4402-91ae-e319c9c34078/volumes" Apr 17 16:43:23.451880 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:23.451848 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9"] Apr 17 16:43:23.452334 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:23.452188 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55cea390-288f-4402-91ae-e319c9c34078" containerName="console" Apr 17 16:43:23.452334 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:23.452201 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cea390-288f-4402-91ae-e319c9c34078" containerName="console" Apr 17 16:43:23.452334 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:23.452254 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="55cea390-288f-4402-91ae-e319c9c34078" containerName="console" Apr 17 16:43:23.455074 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:23.455043 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" Apr 17 16:43:23.457295 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:23.457272 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 16:43:23.458295 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:23.458272 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-f88b3-serving-cert\"" Apr 17 16:43:23.458420 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:23.458308 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-7cr77\"" Apr 17 16:43:23.458420 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:23.458272 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-f88b3-kube-rbac-proxy-sar-config\"" Apr 17 16:43:23.465350 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:23.465331 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9"] Apr 17 16:43:23.544768 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:23.544741 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3da04f35-cb57-4500-9533-465fb969fc44-proxy-tls\") pod \"model-chainer-raw-f88b3-6bdd8989c-6mtz9\" (UID: \"3da04f35-cb57-4500-9533-465fb969fc44\") " pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" Apr 17 16:43:23.544947 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:23.544804 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3da04f35-cb57-4500-9533-465fb969fc44-openshift-service-ca-bundle\") pod 
\"model-chainer-raw-f88b3-6bdd8989c-6mtz9\" (UID: \"3da04f35-cb57-4500-9533-465fb969fc44\") " pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" Apr 17 16:43:23.645666 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:23.645641 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3da04f35-cb57-4500-9533-465fb969fc44-openshift-service-ca-bundle\") pod \"model-chainer-raw-f88b3-6bdd8989c-6mtz9\" (UID: \"3da04f35-cb57-4500-9533-465fb969fc44\") " pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" Apr 17 16:43:23.645836 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:23.645699 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3da04f35-cb57-4500-9533-465fb969fc44-proxy-tls\") pod \"model-chainer-raw-f88b3-6bdd8989c-6mtz9\" (UID: \"3da04f35-cb57-4500-9533-465fb969fc44\") " pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" Apr 17 16:43:23.646343 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:23.646320 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3da04f35-cb57-4500-9533-465fb969fc44-openshift-service-ca-bundle\") pod \"model-chainer-raw-f88b3-6bdd8989c-6mtz9\" (UID: \"3da04f35-cb57-4500-9533-465fb969fc44\") " pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" Apr 17 16:43:23.648143 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:23.648117 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3da04f35-cb57-4500-9533-465fb969fc44-proxy-tls\") pod \"model-chainer-raw-f88b3-6bdd8989c-6mtz9\" (UID: \"3da04f35-cb57-4500-9533-465fb969fc44\") " pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" Apr 17 16:43:23.765321 ip-10-0-141-239 kubenswrapper[2548]: I0417 
16:43:23.765268 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" Apr 17 16:43:23.883033 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:23.883003 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9"] Apr 17 16:43:23.886239 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:43:23.886190 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3da04f35_cb57_4500_9533_465fb969fc44.slice/crio-02e3cf04518e6032be9f324dd11562b679983c81914f4522f16af81fe89156af WatchSource:0}: Error finding container 02e3cf04518e6032be9f324dd11562b679983c81914f4522f16af81fe89156af: Status 404 returned error can't find the container with id 02e3cf04518e6032be9f324dd11562b679983c81914f4522f16af81fe89156af Apr 17 16:43:23.888311 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:23.888295 2548 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:43:24.499449 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:24.499412 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" event={"ID":"3da04f35-cb57-4500-9533-465fb969fc44","Type":"ContainerStarted","Data":"02e3cf04518e6032be9f324dd11562b679983c81914f4522f16af81fe89156af"} Apr 17 16:43:26.507436 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:26.507394 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" event={"ID":"3da04f35-cb57-4500-9533-465fb969fc44","Type":"ContainerStarted","Data":"aa406ec2bd54eea0a8d1cf4daec82bebb703beb74848ecb40b17e99f885fb914"} Apr 17 16:43:26.507825 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:26.507512 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" Apr 17 16:43:26.521997 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:26.521942 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" podStartSLOduration=1.773070344 podStartE2EDuration="3.521924366s" podCreationTimestamp="2026-04-17 16:43:23 +0000 UTC" firstStartedPulling="2026-04-17 16:43:23.888417854 +0000 UTC m=+707.126933612" lastFinishedPulling="2026-04-17 16:43:25.637271877 +0000 UTC m=+708.875787634" observedRunningTime="2026-04-17 16:43:26.521429572 +0000 UTC m=+709.759945362" watchObservedRunningTime="2026-04-17 16:43:26.521924366 +0000 UTC m=+709.760440146" Apr 17 16:43:32.516429 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:32.516397 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" Apr 17 16:43:33.517243 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:33.517208 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9"] Apr 17 16:43:33.517692 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:33.517485 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" podUID="3da04f35-cb57-4500-9533-465fb969fc44" containerName="model-chainer-raw-f88b3" containerID="cri-o://aa406ec2bd54eea0a8d1cf4daec82bebb703beb74848ecb40b17e99f885fb914" gracePeriod=30 Apr 17 16:43:37.514174 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:37.514131 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" podUID="3da04f35-cb57-4500-9533-465fb969fc44" containerName="model-chainer-raw-f88b3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:43:42.515017 ip-10-0-141-239 kubenswrapper[2548]: I0417 
16:43:42.514980 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" podUID="3da04f35-cb57-4500-9533-465fb969fc44" containerName="model-chainer-raw-f88b3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:43:47.514573 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:47.514535 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" podUID="3da04f35-cb57-4500-9533-465fb969fc44" containerName="model-chainer-raw-f88b3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:43:47.514980 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:47.514654 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" Apr 17 16:43:52.514673 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:52.514634 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" podUID="3da04f35-cb57-4500-9533-465fb969fc44" containerName="model-chainer-raw-f88b3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:43:57.514609 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:43:57.514562 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" podUID="3da04f35-cb57-4500-9533-465fb969fc44" containerName="model-chainer-raw-f88b3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:44:02.514181 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:44:02.514144 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" podUID="3da04f35-cb57-4500-9533-465fb969fc44" containerName="model-chainer-raw-f88b3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:44:03.541912 ip-10-0-141-239 
kubenswrapper[2548]: E0417 16:44:03.541867 2548 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3da04f35_cb57_4500_9533_465fb969fc44.slice/crio-conmon-aa406ec2bd54eea0a8d1cf4daec82bebb703beb74848ecb40b17e99f885fb914.scope\": RecentStats: unable to find data in memory cache]" Apr 17 16:44:03.542186 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:44:03.541887 2548 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3da04f35_cb57_4500_9533_465fb969fc44.slice/crio-conmon-aa406ec2bd54eea0a8d1cf4daec82bebb703beb74848ecb40b17e99f885fb914.scope\": RecentStats: unable to find data in memory cache]" Apr 17 16:44:03.624019 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:44:03.623983 2548 generic.go:358] "Generic (PLEG): container finished" podID="3da04f35-cb57-4500-9533-465fb969fc44" containerID="aa406ec2bd54eea0a8d1cf4daec82bebb703beb74848ecb40b17e99f885fb914" exitCode=0 Apr 17 16:44:03.624160 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:44:03.624032 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" event={"ID":"3da04f35-cb57-4500-9533-465fb969fc44","Type":"ContainerDied","Data":"aa406ec2bd54eea0a8d1cf4daec82bebb703beb74848ecb40b17e99f885fb914"} Apr 17 16:44:03.667037 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:44:03.667013 2548 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" Apr 17 16:44:03.753353 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:44:03.753326 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3da04f35-cb57-4500-9533-465fb969fc44-openshift-service-ca-bundle\") pod \"3da04f35-cb57-4500-9533-465fb969fc44\" (UID: \"3da04f35-cb57-4500-9533-465fb969fc44\") " Apr 17 16:44:03.753508 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:44:03.753417 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3da04f35-cb57-4500-9533-465fb969fc44-proxy-tls\") pod \"3da04f35-cb57-4500-9533-465fb969fc44\" (UID: \"3da04f35-cb57-4500-9533-465fb969fc44\") " Apr 17 16:44:03.753722 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:44:03.753703 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3da04f35-cb57-4500-9533-465fb969fc44-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "3da04f35-cb57-4500-9533-465fb969fc44" (UID: "3da04f35-cb57-4500-9533-465fb969fc44"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:44:03.755633 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:44:03.755600 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da04f35-cb57-4500-9533-465fb969fc44-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3da04f35-cb57-4500-9533-465fb969fc44" (UID: "3da04f35-cb57-4500-9533-465fb969fc44"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:44:03.854180 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:44:03.854140 2548 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3da04f35-cb57-4500-9533-465fb969fc44-proxy-tls\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:44:03.854180 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:44:03.854177 2548 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3da04f35-cb57-4500-9533-465fb969fc44-openshift-service-ca-bundle\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:44:04.628105 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:44:04.628071 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" event={"ID":"3da04f35-cb57-4500-9533-465fb969fc44","Type":"ContainerDied","Data":"02e3cf04518e6032be9f324dd11562b679983c81914f4522f16af81fe89156af"} Apr 17 16:44:04.628105 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:44:04.628089 2548 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9" Apr 17 16:44:04.628573 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:44:04.628115 2548 scope.go:117] "RemoveContainer" containerID="aa406ec2bd54eea0a8d1cf4daec82bebb703beb74848ecb40b17e99f885fb914" Apr 17 16:44:04.650515 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:44:04.650488 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9"] Apr 17 16:44:04.654368 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:44:04.654350 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-f88b3-6bdd8989c-6mtz9"] Apr 17 16:44:05.325859 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:44:05.325815 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3da04f35-cb57-4500-9533-465fb969fc44" path="/var/lib/kubelet/pods/3da04f35-cb57-4500-9533-465fb969fc44/volumes" Apr 17 16:45:03.779977 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:03.779884 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9"] Apr 17 16:45:03.780321 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:03.780211 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3da04f35-cb57-4500-9533-465fb969fc44" containerName="model-chainer-raw-f88b3" Apr 17 16:45:03.780321 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:03.780223 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da04f35-cb57-4500-9533-465fb969fc44" containerName="model-chainer-raw-f88b3" Apr 17 16:45:03.780321 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:03.780269 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="3da04f35-cb57-4500-9533-465fb969fc44" containerName="model-chainer-raw-f88b3" Apr 17 16:45:03.783153 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:03.783135 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" Apr 17 16:45:03.785512 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:03.785483 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-10815-serving-cert\"" Apr 17 16:45:03.785512 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:03.785487 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-10815-kube-rbac-proxy-sar-config\"" Apr 17 16:45:03.785691 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:03.785514 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-7cr77\"" Apr 17 16:45:03.785691 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:03.785567 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 16:45:03.791996 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:03.791975 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9"] Apr 17 16:45:03.873205 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:03.873177 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13b43b45-0905-4c46-9a4a-3be4dd08cb67-proxy-tls\") pod \"model-chainer-raw-hpa-10815-748448b99c-gzph9\" (UID: \"13b43b45-0905-4c46-9a4a-3be4dd08cb67\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" Apr 17 16:45:03.873343 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:03.873232 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13b43b45-0905-4c46-9a4a-3be4dd08cb67-openshift-service-ca-bundle\") pod 
\"model-chainer-raw-hpa-10815-748448b99c-gzph9\" (UID: \"13b43b45-0905-4c46-9a4a-3be4dd08cb67\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" Apr 17 16:45:03.974511 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:03.974470 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13b43b45-0905-4c46-9a4a-3be4dd08cb67-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-10815-748448b99c-gzph9\" (UID: \"13b43b45-0905-4c46-9a4a-3be4dd08cb67\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" Apr 17 16:45:03.974642 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:03.974550 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13b43b45-0905-4c46-9a4a-3be4dd08cb67-proxy-tls\") pod \"model-chainer-raw-hpa-10815-748448b99c-gzph9\" (UID: \"13b43b45-0905-4c46-9a4a-3be4dd08cb67\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" Apr 17 16:45:03.975145 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:03.975121 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13b43b45-0905-4c46-9a4a-3be4dd08cb67-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-10815-748448b99c-gzph9\" (UID: \"13b43b45-0905-4c46-9a4a-3be4dd08cb67\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" Apr 17 16:45:03.976991 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:03.976973 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13b43b45-0905-4c46-9a4a-3be4dd08cb67-proxy-tls\") pod \"model-chainer-raw-hpa-10815-748448b99c-gzph9\" (UID: \"13b43b45-0905-4c46-9a4a-3be4dd08cb67\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" Apr 17 16:45:04.093598 
ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:04.093557 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" Apr 17 16:45:04.213511 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:04.213482 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9"] Apr 17 16:45:04.217009 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:45:04.216979 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13b43b45_0905_4c46_9a4a_3be4dd08cb67.slice/crio-2073ea4f6e2f4cb18c4a7f79ef4475d7795e1b4700df3dc1b8c077ee44d53ae5 WatchSource:0}: Error finding container 2073ea4f6e2f4cb18c4a7f79ef4475d7795e1b4700df3dc1b8c077ee44d53ae5: Status 404 returned error can't find the container with id 2073ea4f6e2f4cb18c4a7f79ef4475d7795e1b4700df3dc1b8c077ee44d53ae5 Apr 17 16:45:04.816847 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:04.816806 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" event={"ID":"13b43b45-0905-4c46-9a4a-3be4dd08cb67","Type":"ContainerStarted","Data":"3c68cee96f9a487f6a5d9f6c86a05c1d82215f2c2896f0fc0c09a814edb74317"} Apr 17 16:45:04.816847 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:04.816847 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" event={"ID":"13b43b45-0905-4c46-9a4a-3be4dd08cb67","Type":"ContainerStarted","Data":"2073ea4f6e2f4cb18c4a7f79ef4475d7795e1b4700df3dc1b8c077ee44d53ae5"} Apr 17 16:45:04.817358 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:04.816929 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" Apr 17 16:45:04.832655 ip-10-0-141-239 kubenswrapper[2548]: I0417 
16:45:04.832588 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" podStartSLOduration=1.832574752 podStartE2EDuration="1.832574752s" podCreationTimestamp="2026-04-17 16:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:45:04.83087417 +0000 UTC m=+808.069389948" watchObservedRunningTime="2026-04-17 16:45:04.832574752 +0000 UTC m=+808.071090531" Apr 17 16:45:10.826471 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:10.826444 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" Apr 17 16:45:13.838221 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:13.838181 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9"] Apr 17 16:45:13.838655 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:13.838465 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" podUID="13b43b45-0905-4c46-9a4a-3be4dd08cb67" containerName="model-chainer-raw-hpa-10815" containerID="cri-o://3c68cee96f9a487f6a5d9f6c86a05c1d82215f2c2896f0fc0c09a814edb74317" gracePeriod=30 Apr 17 16:45:15.825463 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:15.825421 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" podUID="13b43b45-0905-4c46-9a4a-3be4dd08cb67" containerName="model-chainer-raw-hpa-10815" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:45:20.824735 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:20.824698 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" 
podUID="13b43b45-0905-4c46-9a4a-3be4dd08cb67" containerName="model-chainer-raw-hpa-10815" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:45:25.824562 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:25.824515 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" podUID="13b43b45-0905-4c46-9a4a-3be4dd08cb67" containerName="model-chainer-raw-hpa-10815" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:45:25.824942 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:25.824621 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" Apr 17 16:45:30.824616 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:30.824566 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" podUID="13b43b45-0905-4c46-9a4a-3be4dd08cb67" containerName="model-chainer-raw-hpa-10815" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:45:35.824619 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:35.824574 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" podUID="13b43b45-0905-4c46-9a4a-3be4dd08cb67" containerName="model-chainer-raw-hpa-10815" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:45:40.824916 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:40.824849 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" podUID="13b43b45-0905-4c46-9a4a-3be4dd08cb67" containerName="model-chainer-raw-hpa-10815" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:45:43.866646 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:45:43.866615 2548 cadvisor_stats_provider.go:525] "Partial 
failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13b43b45_0905_4c46_9a4a_3be4dd08cb67.slice/crio-conmon-3c68cee96f9a487f6a5d9f6c86a05c1d82215f2c2896f0fc0c09a814edb74317.scope\": RecentStats: unable to find data in memory cache]" Apr 17 16:45:43.867079 ip-10-0-141-239 kubenswrapper[2548]: E0417 16:45:43.866625 2548 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13b43b45_0905_4c46_9a4a_3be4dd08cb67.slice/crio-2073ea4f6e2f4cb18c4a7f79ef4475d7795e1b4700df3dc1b8c077ee44d53ae5\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13b43b45_0905_4c46_9a4a_3be4dd08cb67.slice/crio-conmon-3c68cee96f9a487f6a5d9f6c86a05c1d82215f2c2896f0fc0c09a814edb74317.scope\": RecentStats: unable to find data in memory cache]" Apr 17 16:45:43.935269 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:43.935232 2548 generic.go:358] "Generic (PLEG): container finished" podID="13b43b45-0905-4c46-9a4a-3be4dd08cb67" containerID="3c68cee96f9a487f6a5d9f6c86a05c1d82215f2c2896f0fc0c09a814edb74317" exitCode=0 Apr 17 16:45:43.935414 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:43.935267 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" event={"ID":"13b43b45-0905-4c46-9a4a-3be4dd08cb67","Type":"ContainerDied","Data":"3c68cee96f9a487f6a5d9f6c86a05c1d82215f2c2896f0fc0c09a814edb74317"} Apr 17 16:45:43.983342 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:43.983322 2548 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" Apr 17 16:45:44.075068 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:44.075041 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13b43b45-0905-4c46-9a4a-3be4dd08cb67-openshift-service-ca-bundle\") pod \"13b43b45-0905-4c46-9a4a-3be4dd08cb67\" (UID: \"13b43b45-0905-4c46-9a4a-3be4dd08cb67\") " Apr 17 16:45:44.075232 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:44.075086 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13b43b45-0905-4c46-9a4a-3be4dd08cb67-proxy-tls\") pod \"13b43b45-0905-4c46-9a4a-3be4dd08cb67\" (UID: \"13b43b45-0905-4c46-9a4a-3be4dd08cb67\") " Apr 17 16:45:44.075384 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:44.075360 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13b43b45-0905-4c46-9a4a-3be4dd08cb67-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "13b43b45-0905-4c46-9a4a-3be4dd08cb67" (UID: "13b43b45-0905-4c46-9a4a-3be4dd08cb67"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:45:44.077298 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:44.077277 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13b43b45-0905-4c46-9a4a-3be4dd08cb67-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "13b43b45-0905-4c46-9a4a-3be4dd08cb67" (UID: "13b43b45-0905-4c46-9a4a-3be4dd08cb67"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:45:44.176736 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:44.176653 2548 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13b43b45-0905-4c46-9a4a-3be4dd08cb67-openshift-service-ca-bundle\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:45:44.176736 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:44.176691 2548 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13b43b45-0905-4c46-9a4a-3be4dd08cb67-proxy-tls\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 17 16:45:44.938810 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:44.938783 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" Apr 17 16:45:44.938810 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:44.938788 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9" event={"ID":"13b43b45-0905-4c46-9a4a-3be4dd08cb67","Type":"ContainerDied","Data":"2073ea4f6e2f4cb18c4a7f79ef4475d7795e1b4700df3dc1b8c077ee44d53ae5"} Apr 17 16:45:44.939297 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:44.938825 2548 scope.go:117] "RemoveContainer" containerID="3c68cee96f9a487f6a5d9f6c86a05c1d82215f2c2896f0fc0c09a814edb74317" Apr 17 16:45:44.959694 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:44.959671 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9"] Apr 17 16:45:44.962973 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:44.962952 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-10815-748448b99c-gzph9"] Apr 17 16:45:45.325661 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:45:45.325629 2548 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13b43b45-0905-4c46-9a4a-3be4dd08cb67" path="/var/lib/kubelet/pods/13b43b45-0905-4c46-9a4a-3be4dd08cb67/volumes" Apr 17 16:54:16.959079 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:16.959037 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-cp2k7_8405d132-1e05-4ddb-89bd-dcec490db483/global-pull-secret-syncer/0.log" Apr 17 16:54:17.107705 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:17.107677 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-g48dw_eaedeb89-807e-4759-a3fe-7ccfc919f4d7/konnectivity-agent/0.log" Apr 17 16:54:17.216666 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:17.216594 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-239.ec2.internal_1deed1599e86e9837e6b4d3fcce1e268/haproxy/0.log" Apr 17 16:54:21.204239 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:21.204210 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5fbc7d488c-8tbzf_31746d46-0cf2-488b-9f68-e292c43470a6/metrics-server/0.log" Apr 17 16:54:21.230923 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:21.230879 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-nxvml_bd0ee333-af57-4d69-90f8-950628bf752e/monitoring-plugin/0.log" Apr 17 16:54:21.334515 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:21.334488 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dnmj9_8256c616-711b-4108-b911-6d292aed26c2/node-exporter/0.log" Apr 17 16:54:21.358888 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:21.358867 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dnmj9_8256c616-711b-4108-b911-6d292aed26c2/kube-rbac-proxy/0.log" Apr 17 16:54:21.385499 ip-10-0-141-239 
kubenswrapper[2548]: I0417 16:54:21.385483 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dnmj9_8256c616-711b-4108-b911-6d292aed26c2/init-textfile/0.log" Apr 17 16:54:21.899686 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:21.899662 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-cgbt4_f56f8820-e530-469d-92c4-2ff27422a302/prometheus-operator-admission-webhook/0.log" Apr 17 16:54:21.940502 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:21.940457 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5f9d4b86d6-kxw45_029fd86d-6da2-4fcd-9f62-3a1d068b6866/telemeter-client/0.log" Apr 17 16:54:21.969565 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:21.969545 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5f9d4b86d6-kxw45_029fd86d-6da2-4fcd-9f62-3a1d068b6866/reload/0.log" Apr 17 16:54:21.994932 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:21.994890 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5f9d4b86d6-kxw45_029fd86d-6da2-4fcd-9f62-3a1d068b6866/kube-rbac-proxy/0.log" Apr 17 16:54:23.941392 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:23.941367 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69569bd448-mpfbk_f139d23b-3c37-4b10-9082-81783cc8bff8/console/0.log" Apr 17 16:54:23.972520 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:23.972495 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-flqq6_21c90255-d7fb-4403-bf8c-e896082f1d3c/download-server/0.log" Apr 17 16:54:23.990382 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:23.990355 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th"] Apr 17 
16:54:23.990660 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:23.990648 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13b43b45-0905-4c46-9a4a-3be4dd08cb67" containerName="model-chainer-raw-hpa-10815" Apr 17 16:54:23.990711 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:23.990662 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b43b45-0905-4c46-9a4a-3be4dd08cb67" containerName="model-chainer-raw-hpa-10815" Apr 17 16:54:23.990749 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:23.990716 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="13b43b45-0905-4c46-9a4a-3be4dd08cb67" containerName="model-chainer-raw-hpa-10815" Apr 17 16:54:23.993624 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:23.993605 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th" Apr 17 16:54:23.995770 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:23.995753 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-l7rfq\"/\"kube-root-ca.crt\"" Apr 17 16:54:23.996489 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:23.996472 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-l7rfq\"/\"openshift-service-ca.crt\"" Apr 17 16:54:23.996577 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:23.996473 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-l7rfq\"/\"default-dockercfg-9vr2t\"" Apr 17 16:54:24.002439 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.002419 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th"] Apr 17 16:54:24.104467 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.104442 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dclmh\" (UniqueName: \"kubernetes.io/projected/d26653a9-7e06-4bce-9f41-1754ea2be276-kube-api-access-dclmh\") pod \"perf-node-gather-daemonset-q86th\" (UID: \"d26653a9-7e06-4bce-9f41-1754ea2be276\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th" Apr 17 16:54:24.104612 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.104472 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d26653a9-7e06-4bce-9f41-1754ea2be276-lib-modules\") pod \"perf-node-gather-daemonset-q86th\" (UID: \"d26653a9-7e06-4bce-9f41-1754ea2be276\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th" Apr 17 16:54:24.104612 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.104494 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d26653a9-7e06-4bce-9f41-1754ea2be276-proc\") pod \"perf-node-gather-daemonset-q86th\" (UID: \"d26653a9-7e06-4bce-9f41-1754ea2be276\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th" Apr 17 16:54:24.104612 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.104576 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d26653a9-7e06-4bce-9f41-1754ea2be276-sys\") pod \"perf-node-gather-daemonset-q86th\" (UID: \"d26653a9-7e06-4bce-9f41-1754ea2be276\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th" Apr 17 16:54:24.104612 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.104603 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d26653a9-7e06-4bce-9f41-1754ea2be276-podres\") pod \"perf-node-gather-daemonset-q86th\" (UID: \"d26653a9-7e06-4bce-9f41-1754ea2be276\") " 
pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th" Apr 17 16:54:24.205922 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.205818 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d26653a9-7e06-4bce-9f41-1754ea2be276-sys\") pod \"perf-node-gather-daemonset-q86th\" (UID: \"d26653a9-7e06-4bce-9f41-1754ea2be276\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th" Apr 17 16:54:24.205922 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.205862 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d26653a9-7e06-4bce-9f41-1754ea2be276-podres\") pod \"perf-node-gather-daemonset-q86th\" (UID: \"d26653a9-7e06-4bce-9f41-1754ea2be276\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th" Apr 17 16:54:24.206095 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.205943 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dclmh\" (UniqueName: \"kubernetes.io/projected/d26653a9-7e06-4bce-9f41-1754ea2be276-kube-api-access-dclmh\") pod \"perf-node-gather-daemonset-q86th\" (UID: \"d26653a9-7e06-4bce-9f41-1754ea2be276\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th" Apr 17 16:54:24.206095 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.205962 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d26653a9-7e06-4bce-9f41-1754ea2be276-sys\") pod \"perf-node-gather-daemonset-q86th\" (UID: \"d26653a9-7e06-4bce-9f41-1754ea2be276\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th" Apr 17 16:54:24.206095 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.205970 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/d26653a9-7e06-4bce-9f41-1754ea2be276-lib-modules\") pod \"perf-node-gather-daemonset-q86th\" (UID: \"d26653a9-7e06-4bce-9f41-1754ea2be276\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th" Apr 17 16:54:24.206095 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.206009 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d26653a9-7e06-4bce-9f41-1754ea2be276-proc\") pod \"perf-node-gather-daemonset-q86th\" (UID: \"d26653a9-7e06-4bce-9f41-1754ea2be276\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th" Apr 17 16:54:24.206095 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.206079 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d26653a9-7e06-4bce-9f41-1754ea2be276-lib-modules\") pod \"perf-node-gather-daemonset-q86th\" (UID: \"d26653a9-7e06-4bce-9f41-1754ea2be276\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th" Apr 17 16:54:24.206095 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.206083 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d26653a9-7e06-4bce-9f41-1754ea2be276-podres\") pod \"perf-node-gather-daemonset-q86th\" (UID: \"d26653a9-7e06-4bce-9f41-1754ea2be276\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th" Apr 17 16:54:24.206095 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.206093 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d26653a9-7e06-4bce-9f41-1754ea2be276-proc\") pod \"perf-node-gather-daemonset-q86th\" (UID: \"d26653a9-7e06-4bce-9f41-1754ea2be276\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th" Apr 17 16:54:24.214102 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.214082 2548 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dclmh\" (UniqueName: \"kubernetes.io/projected/d26653a9-7e06-4bce-9f41-1754ea2be276-kube-api-access-dclmh\") pod \"perf-node-gather-daemonset-q86th\" (UID: \"d26653a9-7e06-4bce-9f41-1754ea2be276\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th"
Apr 17 16:54:24.303716 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.303680 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th"
Apr 17 16:54:24.421664 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.421632 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th"]
Apr 17 16:54:24.424502 ip-10-0-141-239 kubenswrapper[2548]: W0417 16:54:24.424473 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd26653a9_7e06_4bce_9f41_1754ea2be276.slice/crio-7bfe44cd19e190b3407f6315dc964bd75fd42ca814406fcc2c1c39b97ea81f52 WatchSource:0}: Error finding container 7bfe44cd19e190b3407f6315dc964bd75fd42ca814406fcc2c1c39b97ea81f52: Status 404 returned error can't find the container with id 7bfe44cd19e190b3407f6315dc964bd75fd42ca814406fcc2c1c39b97ea81f52
Apr 17 16:54:24.426064 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.426039 2548 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 16:54:24.514986 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.514958 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th" event={"ID":"d26653a9-7e06-4bce-9f41-1754ea2be276","Type":"ContainerStarted","Data":"7c3ed99b7b94d8cb91ed7f8a2a05597f0059b3db9b1416dd85eb900356361957"}
Apr 17 16:54:24.514986 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.514991 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th" event={"ID":"d26653a9-7e06-4bce-9f41-1754ea2be276","Type":"ContainerStarted","Data":"7bfe44cd19e190b3407f6315dc964bd75fd42ca814406fcc2c1c39b97ea81f52"}
Apr 17 16:54:24.515190 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.515105 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th"
Apr 17 16:54:24.529489 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.529451 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th" podStartSLOduration=1.5294364759999999 podStartE2EDuration="1.529436476s" podCreationTimestamp="2026-04-17 16:54:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:54:24.529210763 +0000 UTC m=+1367.767726524" watchObservedRunningTime="2026-04-17 16:54:24.529436476 +0000 UTC m=+1367.767952257"
Apr 17 16:54:24.995741 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:24.995717 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6w5xn_d91bd4ff-4efd-449d-b375-d8843508d28c/dns/0.log"
Apr 17 16:54:25.016425 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:25.016377 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6w5xn_d91bd4ff-4efd-449d-b375-d8843508d28c/kube-rbac-proxy/0.log"
Apr 17 16:54:25.152985 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:25.152964 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-h548j_c4bbbe54-5b68-47bf-99a9-c5e02ce391cd/dns-node-resolver/0.log"
Apr 17 16:54:25.622154 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:25.622113 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-p555x_883f4572-082b-45cf-809b-87efb82fbb9c/node-ca/0.log"
Apr 17 16:54:26.646472 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:26.646447 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-96878_2e5a8842-0933-4179-9241-eb13bf048769/serve-healthcheck-canary/0.log"
Apr 17 16:54:27.017997 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:27.017918 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-t44zr_58c1791e-3029-4f4d-be45-e94ca7e72a6e/kube-rbac-proxy/0.log"
Apr 17 16:54:27.040630 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:27.040605 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-t44zr_58c1791e-3029-4f4d-be45-e94ca7e72a6e/exporter/0.log"
Apr 17 16:54:27.062425 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:27.062391 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-t44zr_58c1791e-3029-4f4d-be45-e94ca7e72a6e/extractor/0.log"
Apr 17 16:54:29.103832 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:29.103802 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-85bb65f8c4-q9kgs_7e67d4fb-3af2-480a-9d00-a3b393101eca/manager/0.log"
Apr 17 16:54:29.124390 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:29.124369 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-449rq_50ba178c-7c75-47c1-bd38-740ebeecf1fc/manager/0.log"
Apr 17 16:54:29.148361 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:29.148340 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-qm6bd_9a07f87b-e090-4bc7-8839-82ce1a4ddb0c/server/0.log"
Apr 17 16:54:29.338604 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:29.338572 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-skv55_9e91b1f8-9725-4761-b726-d5e59ab3c67c/seaweedfs/0.log"
Apr 17 16:54:30.533451 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:30.533426 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-q86th"
Apr 17 16:54:34.647093 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:34.647067 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2qscf_c756a090-293e-4944-9021-f8de796a8b45/kube-multus-additional-cni-plugins/0.log"
Apr 17 16:54:34.678237 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:34.678215 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2qscf_c756a090-293e-4944-9021-f8de796a8b45/egress-router-binary-copy/0.log"
Apr 17 16:54:34.699460 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:34.699440 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2qscf_c756a090-293e-4944-9021-f8de796a8b45/cni-plugins/0.log"
Apr 17 16:54:34.720762 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:34.720741 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2qscf_c756a090-293e-4944-9021-f8de796a8b45/bond-cni-plugin/0.log"
Apr 17 16:54:34.741199 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:34.741175 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2qscf_c756a090-293e-4944-9021-f8de796a8b45/routeoverride-cni/0.log"
Apr 17 16:54:34.762967 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:34.762949 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2qscf_c756a090-293e-4944-9021-f8de796a8b45/whereabouts-cni-bincopy/0.log"
Apr 17 16:54:34.783436 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:34.783418 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2qscf_c756a090-293e-4944-9021-f8de796a8b45/whereabouts-cni/0.log"
Apr 17 16:54:35.152553 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:35.152527 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f29ht_db6e591a-0918-41c9-a16d-9999ecbf1df5/kube-multus/0.log"
Apr 17 16:54:35.258000 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:35.257975 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zsg2s_60cbc498-937e-4f93-95af-294c0a8e7beb/network-metrics-daemon/0.log"
Apr 17 16:54:35.276966 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:35.276942 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zsg2s_60cbc498-937e-4f93-95af-294c0a8e7beb/kube-rbac-proxy/0.log"
Apr 17 16:54:36.391708 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:36.391681 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lq8np_6f8baf84-e2c1-4c17-bc5c-e068af8f6439/ovn-controller/0.log"
Apr 17 16:54:36.423495 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:36.423472 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lq8np_6f8baf84-e2c1-4c17-bc5c-e068af8f6439/ovn-acl-logging/0.log"
Apr 17 16:54:36.444297 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:36.444280 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lq8np_6f8baf84-e2c1-4c17-bc5c-e068af8f6439/kube-rbac-proxy-node/0.log"
Apr 17 16:54:36.468682 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:36.468664 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lq8np_6f8baf84-e2c1-4c17-bc5c-e068af8f6439/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 16:54:36.496489 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:36.496472 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lq8np_6f8baf84-e2c1-4c17-bc5c-e068af8f6439/northd/0.log"
Apr 17 16:54:36.521669 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:36.521640 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lq8np_6f8baf84-e2c1-4c17-bc5c-e068af8f6439/nbdb/0.log"
Apr 17 16:54:36.546099 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:36.546080 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lq8np_6f8baf84-e2c1-4c17-bc5c-e068af8f6439/sbdb/0.log"
Apr 17 16:54:36.639578 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:36.639526 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lq8np_6f8baf84-e2c1-4c17-bc5c-e068af8f6439/ovnkube-controller/0.log"
Apr 17 16:54:37.829148 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:37.829117 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-29tlc_855921ad-75be-4568-9884-d3f6c5e1a862/network-check-target-container/0.log"
Apr 17 16:54:38.775987 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:38.775949 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-csqpf_bfa866dd-f0dc-4c76-ac8b-1e2b8c5e7a90/iptables-alerter/0.log"
Apr 17 16:54:39.441829 ip-10-0-141-239 kubenswrapper[2548]: I0417 16:54:39.441788 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-x2tmr_a97f9be6-2d21-46a6-95a1-50608634459b/tuned/0.log"