Apr 24 14:24:13.561383 ip-10-0-138-116 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 14:24:13.992702 ip-10-0-138-116 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:24:13.992702 ip-10-0-138-116 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 14:24:13.992702 ip-10-0-138-116 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:24:13.992702 ip-10-0-138-116 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 14:24:13.992702 ip-10-0-138-116 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
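The deprecation warnings above all point at the same migration: move these settings into the kubelet config file (on this node, `--config=/etc/kubernetes/kubelet.conf` per the flag dump further down). A minimal sketch of the equivalent `KubeletConfiguration` stanzas — the values shown are illustrative placeholders, not taken from this node:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir (path is an example)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (amounts are examples)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction thresholds
evictionHard:
  memory.available: 100Mi
```

On a managed OpenShift node this file is typically rendered by the Machine Config Operator rather than edited by hand.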
Apr 24 14:24:13.994990 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.994906 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 14:24:13.998747 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998733 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:24:13.998787 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998749 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:24:13.998787 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998753 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:24:13.998787 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998756 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:24:13.998787 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998759 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:24:13.998787 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998762 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:24:13.998787 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998764 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:24:13.998787 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998767 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:24:13.998787 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998770 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:24:13.998787 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998772 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:24:13.998787 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998775 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:24:13.998787 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998777 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:24:13.998787 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998780 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:24:13.998787 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998783 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:24:13.998787 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998785 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:24:13.998787 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998788 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:24:13.998787 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998791 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:24:13.998787 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998794 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:24:13.998787 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998796 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:24:13.999220 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998799 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:24:13.999220 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998802 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:24:13.999220 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998805 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:24:13.999220 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998810 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:24:13.999220 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998813 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:24:13.999220 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998816 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:24:13.999220 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998818 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:24:13.999220 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998820 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:24:13.999220 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998823 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:24:13.999220 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998825 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:24:13.999220 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998828 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:24:13.999220 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998830 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:24:13.999220 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998833 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:24:13.999220 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998835 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:24:13.999220 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998837 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:24:13.999220 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998840 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:24:13.999220 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998842 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:24:13.999220 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998846 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:24:13.999220 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998848 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:24:13.999220 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998851 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:24:13.999680 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998853 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:24:13.999680 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998855 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:24:13.999680 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998858 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:24:13.999680 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998860 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:24:13.999680 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998863 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:24:13.999680 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998865 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:24:13.999680 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998867 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:24:13.999680 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998870 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:24:13.999680 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998872 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:24:13.999680 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998874 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:24:13.999680 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998877 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:24:13.999680 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998879 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:24:13.999680 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998882 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:24:13.999680 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998885 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:24:13.999680 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998888 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:24:13.999680 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998891 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:24:13.999680 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998893 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:24:13.999680 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998896 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:24:13.999680 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998898 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:24:14.000133 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998900 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:24:14.000133 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998903 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:24:14.000133 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998907 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:24:14.000133 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998910 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:24:14.000133 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998913 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:24:14.000133 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998918 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:24:14.000133 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998921 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:24:14.000133 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998924 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:24:14.000133 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998927 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:24:14.000133 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998930 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:24:14.000133 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998932 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:24:14.000133 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998935 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:24:14.000133 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998937 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:24:14.000133 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998940 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:24:14.000133 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998942 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:24:14.000133 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998945 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:24:14.000133 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998947 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:24:14.000133 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998950 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:24:14.000133 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998953 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:24:14.000578 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998955 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:24:14.000578 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998958 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:24:14.000578 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998960 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:24:14.000578 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998963 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:24:14.000578 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998965 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:24:14.000578 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998967 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:24:14.000578 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998970 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:24:14.000578 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998980 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:24:14.000578 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.998983 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:24:14.000578 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999403 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:24:14.000578 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999409 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:24:14.000578 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999412 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:24:14.000578 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999415 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:24:14.000578 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999418 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:24:14.000578 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999421 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:24:14.000578 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999424 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:24:14.000578 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999427 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:24:14.000578 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999430 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:24:14.000578 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999432 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:24:14.001021 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999435 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:24:14.001021 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999438 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:24:14.001021 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999441 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:24:14.001021 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999443 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:24:14.001021 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999446 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:24:14.001021 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999448 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:24:14.001021 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999451 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:24:14.001021 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999453 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:24:14.001021 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999455 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:24:14.001021 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999459 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:24:14.001021 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999462 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:24:14.001021 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999466 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:24:14.001021 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999469 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:24:14.001021 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999471 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:24:14.001021 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999474 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:24:14.001021 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999477 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:24:14.001021 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999479 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:24:14.001021 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999482 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:24:14.001021 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999484 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:24:14.001493 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999486 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:24:14.001493 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999490 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:24:14.001493 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999493 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:24:14.001493 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999496 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:24:14.001493 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999498 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:24:14.001493 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999501 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:24:14.001493 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999503 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:24:14.001493 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999505 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:24:14.001493 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999509 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:24:14.001493 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999511 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:24:14.001493 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999513 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:24:14.001493 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999516 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:24:14.001493 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999518 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:24:14.001493 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999521 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:24:14.001493 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999523 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:24:14.001493 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999526 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:24:14.001493 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999528 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:24:14.001493 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999531 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:24:14.001493 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999535 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:24:14.001493 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999538 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:24:14.001972 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999540 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:24:14.001972 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999543 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:24:14.001972 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999545 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:24:14.001972 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999547 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:24:14.001972 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999550 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:24:14.001972 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999554 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:24:14.001972 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999556 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:24:14.001972 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999558 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:24:14.001972 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999561 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:24:14.001972 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999563 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:24:14.001972 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999566 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:24:14.001972 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999569 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:24:14.001972 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999571 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:24:14.001972 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999573 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:24:14.001972 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999576 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:24:14.001972 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999578 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:24:14.001972 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999580 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:24:14.001972 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999583 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:24:14.001972 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999585 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:24:14.001972 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999587 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:24:14.002473 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999590 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:24:14.002473 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999592 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:24:14.002473 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999595 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:24:14.002473 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999597 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:24:14.002473 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999600 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:24:14.002473 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999603 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:24:14.002473 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999605 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:24:14.002473 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999608 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:24:14.002473 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999610 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:24:14.002473 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999612 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:24:14.002473 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999615 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:24:14.002473 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999617 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:24:14.002473 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999619 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:24:14.002473 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999622 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:24:14.002473 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999624 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:24:14.002473 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999627 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:24:14.002473 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:13.999630 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:24:14.002473 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999694 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 14:24:14.002473 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999701 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 14:24:14.002473 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999708 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 14:24:14.002473 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999712 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999717 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999722 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999728 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999734 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999739 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999744 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999751 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999754 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999759 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999762 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999765 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999769 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999772 2571 flags.go:64] FLAG: --cloud-config=""
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999777 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999779 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999785 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999787 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999791 2571 flags.go:64] FLAG: --config-dir=""
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999794 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999797 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999801 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999804 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999807 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 14:24:14.002988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999810 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999813 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999815 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999818 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999822 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999824 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999828 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999836 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999839 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999842 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999845 2571 flags.go:64] FLAG: --enable-server="true"
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999848 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999852 2571 flags.go:64] FLAG: --event-burst="100"
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999855 2571 flags.go:64] FLAG: --event-qps="50"
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999858 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999861 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999863 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999867 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999870 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999873 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999876 2571 flags.go:64] FLAG: --eviction-soft=""
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999879 2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999882 2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999885 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999888 2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 14:24:14.003592 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999890 2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999893 2571 flags.go:64] FLAG:
--fail-swap-on="true" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999896 2571 flags.go:64] FLAG: --feature-gates="" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999899 2571 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999902 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999906 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999909 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999912 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999914 2571 flags.go:64] FLAG: --help="false" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999917 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-138-116.ec2.internal" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999922 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999925 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999927 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999931 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999935 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 
14:24:13.999938 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999941 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999944 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999947 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999949 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999953 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999955 2571 flags.go:64] FLAG: --kube-reserved="" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999959 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999961 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 14:24:14.004218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999964 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999967 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999970 2571 flags.go:64] FLAG: --lock-file="" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999973 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999975 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999978 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 14:24:14.004802 
ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999983 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999986 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999989 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999991 2571 flags.go:64] FLAG: --logging-format="text" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999994 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:13.999998 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000000 2571 flags.go:64] FLAG: --manifest-url="" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000003 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000007 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000010 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000014 2571 flags.go:64] FLAG: --max-pods="110" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000017 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000021 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000024 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000027 2571 
flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000030 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000034 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000037 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000045 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 14:24:14.004802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000048 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000051 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000054 2571 flags.go:64] FLAG: --pod-cidr="" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000057 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000063 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000065 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000068 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000071 2571 flags.go:64] FLAG: --port="10250" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000074 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: 
I0424 14:24:14.000077 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0d3cef55897136e5d" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000080 2571 flags.go:64] FLAG: --qos-reserved="" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000083 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000086 2571 flags.go:64] FLAG: --register-node="true" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000089 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000092 2571 flags.go:64] FLAG: --register-with-taints="" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000109 2571 flags.go:64] FLAG: --registry-burst="10" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000112 2571 flags.go:64] FLAG: --registry-qps="5" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000115 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000118 2571 flags.go:64] FLAG: --reserved-memory="" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000122 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000125 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000128 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000131 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000133 2571 flags.go:64] FLAG: --runonce="false" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: 
I0424 14:24:14.000136 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 14:24:14.005444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000141 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000144 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000147 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000149 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000152 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000156 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000159 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000162 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000165 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000168 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000171 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000173 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000176 2571 flags.go:64] FLAG: --system-cgroups="" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000179 2571 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000184 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000187 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000190 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000194 2571 flags.go:64] FLAG: --tls-min-version="" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000196 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000199 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000202 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000204 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000208 2571 flags.go:64] FLAG: --v="2" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000212 2571 flags.go:64] FLAG: --version="false" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000216 2571 flags.go:64] FLAG: --vmodule="" Apr 24 14:24:14.006142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000219 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 14:24:14.006785 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000222 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 14:24:14.006785 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000315 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 14:24:14.006785 ip-10-0-138-116 
kubenswrapper[2571]: W0424 14:24:14.000318 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 14:24:14.006785 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000321 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 14:24:14.006785 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000324 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 14:24:14.006785 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000327 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 14:24:14.006785 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000330 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 14:24:14.006785 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000333 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 14:24:14.006785 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000335 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 14:24:14.006785 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000338 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 14:24:14.006785 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000340 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 14:24:14.006785 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000344 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 14:24:14.006785 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000346 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 14:24:14.006785 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000349 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 14:24:14.006785 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000352 2571 
feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 14:24:14.006785 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000355 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 14:24:14.006785 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000357 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 14:24:14.006785 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000360 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 14:24:14.006785 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000362 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 14:24:14.006785 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000365 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 14:24:14.007312 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000367 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 14:24:14.007312 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000370 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 14:24:14.007312 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000374 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 14:24:14.007312 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000377 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 14:24:14.007312 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000380 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 14:24:14.007312 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000382 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 14:24:14.007312 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000385 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 14:24:14.007312 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000387 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 14:24:14.007312 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000389 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 14:24:14.007312 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000392 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 14:24:14.007312 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000394 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 14:24:14.007312 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000397 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 14:24:14.007312 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000400 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 14:24:14.007312 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000403 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 14:24:14.007312 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000407 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 14:24:14.007312 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000410 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 14:24:14.007312 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000413 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 14:24:14.007312 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000415 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 24 14:24:14.007312 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000419 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 14:24:14.007312 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000422 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 14:24:14.007864 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000424 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 14:24:14.007864 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000428 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 14:24:14.007864 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000430 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 14:24:14.007864 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000434 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 14:24:14.007864 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000437 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 14:24:14.007864 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000439 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 14:24:14.007864 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000442 2571 feature_gate.go:328] unrecognized 
feature gate: VSphereMultiDisk Apr 24 14:24:14.007864 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000445 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 14:24:14.007864 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000448 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 14:24:14.007864 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000450 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 14:24:14.007864 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000453 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 14:24:14.007864 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000455 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 14:24:14.007864 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000458 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 14:24:14.007864 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000460 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 14:24:14.007864 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000463 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 14:24:14.007864 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000465 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 14:24:14.007864 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000467 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 14:24:14.007864 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000470 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 14:24:14.007864 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000472 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 14:24:14.007864 ip-10-0-138-116 kubenswrapper[2571]: 
W0424 14:24:14.000475 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 14:24:14.008403 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000477 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 14:24:14.008403 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000480 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 14:24:14.008403 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000482 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 14:24:14.008403 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000484 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 14:24:14.008403 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000487 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 14:24:14.008403 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000489 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 14:24:14.008403 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000492 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 14:24:14.008403 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000494 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 14:24:14.008403 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000496 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 14:24:14.008403 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000499 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 14:24:14.008403 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000503 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 14:24:14.008403 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000505 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 
14:24:14.008403 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000508 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:24:14.008403 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000510 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:24:14.008403 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000512 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:24:14.008403 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000516 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:24:14.008403 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000518 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:24:14.008403 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000521 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:24:14.008403 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000523 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:24:14.008403 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000525 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:24:14.008974 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000528 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:24:14.008974 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000531 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:24:14.008974 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000533 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:24:14.008974 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000536 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:24:14.008974 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000538 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:24:14.008974 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000540 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:24:14.008974 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.000543 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:24:14.008974 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.000548 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 14:24:14.008974 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.007445 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 14:24:14.008974 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.007461 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 14:24:14.008974 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007524 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:24:14.008974 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007529 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:24:14.008974 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007533 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:24:14.008974 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007536 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:24:14.008974 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007539 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:24:14.009386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007543 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:24:14.009386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007546 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:24:14.009386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007548 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:24:14.009386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007551 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:24:14.009386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007553 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:24:14.009386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007555 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:24:14.009386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007558 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:24:14.009386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007560 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:24:14.009386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007563 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:24:14.009386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007565 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:24:14.009386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007568 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:24:14.009386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007571 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:24:14.009386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007573 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:24:14.009386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007576 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:24:14.009386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007579 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:24:14.009386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007581 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:24:14.009386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007584 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:24:14.009386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007586 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:24:14.009386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007589 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:24:14.009889 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007592 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:24:14.009889 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007594 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:24:14.009889 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007597 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:24:14.009889 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007600 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:24:14.009889 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007602 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:24:14.009889 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007606 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:24:14.009889 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007611 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:24:14.009889 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007615 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:24:14.009889 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007618 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:24:14.009889 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007621 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:24:14.009889 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007623 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:24:14.009889 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007626 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:24:14.009889 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007628 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:24:14.009889 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007631 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:24:14.009889 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007633 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:24:14.009889 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007636 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:24:14.009889 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007638 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:24:14.009889 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007641 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:24:14.009889 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007643 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:24:14.009889 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007646 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:24:14.010417 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007649 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:24:14.010417 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007651 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:24:14.010417 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007653 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:24:14.010417 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007656 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:24:14.010417 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007658 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:24:14.010417 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007661 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:24:14.010417 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007664 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:24:14.010417 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007666 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:24:14.010417 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007669 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:24:14.010417 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007671 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:24:14.010417 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007674 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:24:14.010417 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007676 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:24:14.010417 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007679 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:24:14.010417 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007681 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:24:14.010417 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007684 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:24:14.010417 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007686 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:24:14.010417 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007689 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:24:14.010417 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007691 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:24:14.010417 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007694 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:24:14.010417 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007697 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:24:14.010898 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007700 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:24:14.010898 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007703 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:24:14.010898 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007705 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:24:14.010898 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007708 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:24:14.010898 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007711 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:24:14.010898 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007713 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:24:14.010898 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007716 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:24:14.010898 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007718 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:24:14.010898 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007721 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:24:14.010898 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007723 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:24:14.010898 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007725 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:24:14.010898 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007728 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:24:14.010898 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007731 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:24:14.010898 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007735 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:24:14.010898 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007737 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:24:14.010898 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007740 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:24:14.010898 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007743 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:24:14.010898 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007745 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:24:14.010898 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007747 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:24:14.010898 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007750 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:24:14.011386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007752 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:24:14.011386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007755 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:24:14.011386 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.007760 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 14:24:14.011386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007863 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:24:14.011386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007869 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:24:14.011386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007872 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:24:14.011386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007875 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:24:14.011386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007879 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:24:14.011386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007883 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:24:14.011386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007886 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:24:14.011386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007889 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:24:14.011386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007891 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:24:14.011386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007894 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:24:14.011386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007897 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:24:14.011386 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007900 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:24:14.011744 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007903 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:24:14.011744 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007905 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:24:14.011744 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007908 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:24:14.011744 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007910 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:24:14.011744 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007913 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:24:14.011744 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007915 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:24:14.011744 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007917 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:24:14.011744 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007920 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:24:14.011744 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007922 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:24:14.011744 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007924 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:24:14.011744 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007927 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:24:14.011744 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007930 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:24:14.011744 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007932 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:24:14.011744 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007934 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:24:14.011744 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007937 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:24:14.011744 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007939 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:24:14.011744 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007942 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:24:14.011744 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007944 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:24:14.011744 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007946 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:24:14.011744 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007949 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:24:14.012246 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007951 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:24:14.012246 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007954 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:24:14.012246 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007956 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:24:14.012246 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007959 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:24:14.012246 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007961 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:24:14.012246 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007963 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:24:14.012246 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007966 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:24:14.012246 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007969 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:24:14.012246 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007971 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:24:14.012246 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007974 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:24:14.012246 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007978 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:24:14.012246 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007981 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:24:14.012246 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007984 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:24:14.012246 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007987 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:24:14.012246 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007990 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:24:14.012246 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007993 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:24:14.012246 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007996 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:24:14.012246 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.007998 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:24:14.012246 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008001 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:24:14.012688 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008003 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:24:14.012688 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008006 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:24:14.012688 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008008 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:24:14.012688 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008011 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:24:14.012688 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008013 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:24:14.012688 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008015 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:24:14.012688 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008018 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:24:14.012688 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008020 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:24:14.012688 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008023 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:24:14.012688 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008025 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:24:14.012688 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008028 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:24:14.012688 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008030 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:24:14.012688 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008032 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:24:14.012688 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008035 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:24:14.012688 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008037 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:24:14.012688 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008040 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:24:14.012688 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008043 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:24:14.012688 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008045 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:24:14.012688 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008047 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:24:14.012688 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008050 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:24:14.013181 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008052 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:24:14.013181 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008054 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:24:14.013181 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008057 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:24:14.013181 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008060 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:24:14.013181 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008062 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:24:14.013181 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008065 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:24:14.013181 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008067 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:24:14.013181 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008070 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:24:14.013181 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008072 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:24:14.013181 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008074 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:24:14.013181 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008076 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:24:14.013181 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008079 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:24:14.013181 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008081 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:24:14.013181 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008083 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:24:14.013181 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:14.008086 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:24:14.013528 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.008090 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 14:24:14.013528 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.008837 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 14:24:14.013528 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.012944 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 14:24:14.014788 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.014776 2571 server.go:1019] "Starting client certificate rotation"
Apr 24 14:24:14.014890 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.014874 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 14:24:14.014927 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.014917 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 14:24:14.036415 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.036397 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 14:24:14.038773 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.038756 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 14:24:14.052026 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.052005 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 24 14:24:14.057177 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.057158 2571 log.go:25] "Validated CRI v1 image API"
Apr 24 14:24:14.058337 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.058320 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 14:24:14.062278 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.062260 2571 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 9194e093-795e-4b9b-b524-bf2c9ca186b4:/dev/nvme0n1p4 fe29a8da-774a-4c93-b14f-e46a860ac215:/dev/nvme0n1p3]
Apr 24 14:24:14.062356 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.062278 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 14:24:14.065863 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.065842 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 14:24:14.068044 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.067937 2571 manager.go:217] Machine: {Timestamp:2026-04-24 14:24:14.066212738 +0000 UTC m=+0.397490492 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3200090 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec22969c4b6997fa27333c712a0ce4e9 SystemUUID:ec22969c-4b69-97fa-2733-3c712a0ce4e9 BootID:b528987c-906f-4ea3-8293-a13204f8a349 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:bc:65:f4:06:5b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:bc:65:f4:06:5b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:9e:25:62:d0:b1:7f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 14:24:14.068044 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.068038 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 14:24:14.068180 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.068132 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 14:24:14.069227 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.069202 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 14:24:14.069361 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.069229 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-116.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 14:24:14.069404 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.069370 2571 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 14:24:14.069404 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.069378 2571 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 14:24:14.069404 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.069390 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 14:24:14.070080 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.070069 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 14:24:14.071311 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.071301 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 24 14:24:14.071414 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.071406 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 14:24:14.073636 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.073627 2571 kubelet.go:491] "Attempting to sync node with API server" Apr 24 14:24:14.073674 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.073645 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 14:24:14.073674 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.073658 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 14:24:14.073674 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.073667 2571 kubelet.go:397] "Adding apiserver pod source" Apr 24 14:24:14.073787 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.073676 2571 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 24 14:24:14.074730 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.074715 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 14:24:14.074730 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.074732 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 14:24:14.077511 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.077496 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 14:24:14.078722 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.078704 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 14:24:14.080325 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.080311 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 14:24:14.080405 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.080329 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 14:24:14.080405 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.080339 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 14:24:14.080405 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.080347 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 14:24:14.080405 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.080355 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 14:24:14.080405 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.080365 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 14:24:14.080405 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.080373 2571 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 24 14:24:14.080405 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.080382 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 14:24:14.080405 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.080391 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 14:24:14.080405 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.080401 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 14:24:14.080669 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.080413 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 14:24:14.080669 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.080426 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 14:24:14.081311 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.081300 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 14:24:14.081371 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.081313 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 14:24:14.083331 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.083311 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-116.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 14:24:14.083331 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:14.083322 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-116.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 14:24:14.083472 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:14.083347 2571 
reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 14:24:14.084941 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.084925 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 14:24:14.085021 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.084967 2571 server.go:1295] "Started kubelet" Apr 24 14:24:14.085078 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.085057 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 14:24:14.085144 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.085062 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 14:24:14.085183 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.085158 2571 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 14:24:14.085707 ip-10-0-138-116 systemd[1]: Started Kubernetes Kubelet. Apr 24 14:24:14.086277 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.086264 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 14:24:14.088150 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.088115 2571 server.go:317] "Adding debug handlers to kubelet server" Apr 24 14:24:14.089365 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.089338 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hx9w7" Apr 24 14:24:14.093682 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:14.093661 2571 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 14:24:14.093891 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.093881 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 14:24:14.093945 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.093897 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 14:24:14.094482 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.094454 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 14:24:14.094482 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.094456 2571 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 14:24:14.094482 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.094482 2571 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 14:24:14.094780 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:14.094751 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-116.ec2.internal\" not found" Apr 24 14:24:14.094848 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.094790 2571 factory.go:153] Registering CRI-O factory Apr 24 14:24:14.094848 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.094804 2571 factory.go:223] Registration of the crio container factory successfully Apr 24 14:24:14.094933 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.094848 2571 reconstruct.go:97] "Volume reconstruction finished" Apr 24 14:24:14.094933 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.094860 2571 reconciler.go:26] "Reconciler: start to sync state" Apr 24 14:24:14.094933 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.094851 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: 
connect: no such file or directory Apr 24 14:24:14.094933 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.094880 2571 factory.go:55] Registering systemd factory Apr 24 14:24:14.094933 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.094890 2571 factory.go:223] Registration of the systemd container factory successfully Apr 24 14:24:14.094933 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.094912 2571 factory.go:103] Registering Raw factory Apr 24 14:24:14.094933 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.094927 2571 manager.go:1196] Started watching for new ooms in manager Apr 24 14:24:14.095750 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.095735 2571 manager.go:319] Starting recovery of all containers Apr 24 14:24:14.096908 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.096887 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hx9w7" Apr 24 14:24:14.100027 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:14.099724 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 14:24:14.100886 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:14.099931 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-116.ec2.internal.18a9510f4a291777 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-116.ec2.internal,UID:ip-10-0-138-116.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-116.ec2.internal,},FirstTimestamp:2026-04-24 14:24:14.084937591 +0000 UTC m=+0.416215346,LastTimestamp:2026-04-24 14:24:14.084937591 +0000 UTC m=+0.416215346,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-116.ec2.internal,}" Apr 24 14:24:14.100987 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:14.100968 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-116.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 14:24:14.105203 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.105188 2571 manager.go:324] Recovery completed Apr 24 14:24:14.109680 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.109668 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:14.111917 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.111899 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-116.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:14.111980 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.111939 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-116.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:14.111980 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.111951 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-116.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:14.112419 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.112406 2571 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 14:24:14.112419 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.112417 2571 cpu_manager.go:223] "Reconciling" 
reconcilePeriod="10s" Apr 24 14:24:14.112510 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.112431 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 24 14:24:14.114584 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.114571 2571 policy_none.go:49] "None policy: Start" Apr 24 14:24:14.114640 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.114587 2571 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 14:24:14.114640 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.114596 2571 state_mem.go:35] "Initializing new in-memory state store" Apr 24 14:24:14.153670 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.153651 2571 manager.go:341] "Starting Device Plugin manager" Apr 24 14:24:14.171687 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:14.153686 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 14:24:14.171687 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.153699 2571 server.go:85] "Starting device plugin registration server" Apr 24 14:24:14.171687 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.153977 2571 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 14:24:14.171687 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.153989 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 14:24:14.171687 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.154082 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 14:24:14.171687 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.154188 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 14:24:14.171687 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.154198 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 14:24:14.171687 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:14.154666 
2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 14:24:14.171687 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:14.154702 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-116.ec2.internal\" not found" Apr 24 14:24:14.223240 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.223211 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 14:24:14.224384 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.224368 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 14:24:14.224473 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.224393 2571 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 14:24:14.224473 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.224409 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 14:24:14.224473 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.224416 2571 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 14:24:14.224473 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:14.224445 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 14:24:14.227653 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.227636 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:24:14.254960 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.254944 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:14.256000 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.255978 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-116.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:14.256078 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.256009 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-116.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:14.256078 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.256024 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-116.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:14.256078 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.256051 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-116.ec2.internal" Apr 24 14:24:14.264415 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.264398 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-116.ec2.internal" Apr 24 14:24:14.264467 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:14.264419 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-116.ec2.internal\": node \"ip-10-0-138-116.ec2.internal\" not found" Apr 24 
14:24:14.289690 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:14.289667 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-116.ec2.internal\" not found" Apr 24 14:24:14.324740 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.324706 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-116.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-116.ec2.internal"] Apr 24 14:24:14.324817 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.324779 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:14.325593 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.325577 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-116.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:14.325699 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.325610 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-116.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:14.325699 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.325624 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-116.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:14.327744 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.327730 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:14.327882 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.327867 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-116.ec2.internal" Apr 24 14:24:14.327928 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.327902 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:14.328444 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.328426 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-116.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:14.328508 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.328460 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-116.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:14.328508 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.328471 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-116.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:14.328508 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.328426 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-116.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:14.328617 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.328530 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-116.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:14.328617 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.328542 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-116.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:14.330557 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.330542 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-116.ec2.internal" Apr 24 14:24:14.330557 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.330564 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:14.331126 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.331111 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-116.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:14.331198 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.331140 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-116.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:14.331198 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.331149 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-116.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:14.360606 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:14.360591 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-116.ec2.internal\" not found" node="ip-10-0-138-116.ec2.internal" Apr 24 14:24:14.364799 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:14.364780 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-116.ec2.internal\" not found" node="ip-10-0-138-116.ec2.internal" Apr 24 14:24:14.390773 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:14.390755 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-116.ec2.internal\" not found" Apr 24 14:24:14.396075 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.396060 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/63ded8f515ead39ec1575ef940918d4c-config\") pod 
\"kube-apiserver-proxy-ip-10-0-138-116.ec2.internal\" (UID: \"63ded8f515ead39ec1575ef940918d4c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-116.ec2.internal" Apr 24 14:24:14.396148 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.396084 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/67ed165269d2376e0c4ebf616952f5d1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-116.ec2.internal\" (UID: \"67ed165269d2376e0c4ebf616952f5d1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-116.ec2.internal" Apr 24 14:24:14.396148 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.396114 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/67ed165269d2376e0c4ebf616952f5d1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-116.ec2.internal\" (UID: \"67ed165269d2376e0c4ebf616952f5d1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-116.ec2.internal" Apr 24 14:24:14.491336 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:14.491313 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-116.ec2.internal\" not found" Apr 24 14:24:14.496679 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.496665 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/67ed165269d2376e0c4ebf616952f5d1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-116.ec2.internal\" (UID: \"67ed165269d2376e0c4ebf616952f5d1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-116.ec2.internal" Apr 24 14:24:14.496732 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.496687 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/67ed165269d2376e0c4ebf616952f5d1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-116.ec2.internal\" (UID: \"67ed165269d2376e0c4ebf616952f5d1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-116.ec2.internal"
Apr 24 14:24:14.496732 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.496704 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/63ded8f515ead39ec1575ef940918d4c-config\") pod \"kube-apiserver-proxy-ip-10-0-138-116.ec2.internal\" (UID: \"63ded8f515ead39ec1575ef940918d4c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-116.ec2.internal"
Apr 24 14:24:14.496824 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.496750 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/63ded8f515ead39ec1575ef940918d4c-config\") pod \"kube-apiserver-proxy-ip-10-0-138-116.ec2.internal\" (UID: \"63ded8f515ead39ec1575ef940918d4c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-116.ec2.internal"
Apr 24 14:24:14.496824 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.496773 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/67ed165269d2376e0c4ebf616952f5d1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-116.ec2.internal\" (UID: \"67ed165269d2376e0c4ebf616952f5d1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-116.ec2.internal"
Apr 24 14:24:14.496824 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.496778 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/67ed165269d2376e0c4ebf616952f5d1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-116.ec2.internal\" (UID: \"67ed165269d2376e0c4ebf616952f5d1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-116.ec2.internal"
Apr 24 14:24:14.592133 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:14.592054 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-116.ec2.internal\" not found"
Apr 24 14:24:14.661584 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.661559 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-116.ec2.internal"
Apr 24 14:24:14.667111 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.667080 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-116.ec2.internal"
Apr 24 14:24:14.692626 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:14.692603 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-116.ec2.internal\" not found"
Apr 24 14:24:14.793094 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:14.793065 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-116.ec2.internal\" not found"
Apr 24 14:24:14.893749 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:14.893676 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-116.ec2.internal\" not found"
Apr 24 14:24:14.894255 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.894240 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:24:14.994474 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:14.994439 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-116.ec2.internal"
Apr 24 14:24:15.006068 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.006048 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 14:24:15.007621 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.007608 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-116.ec2.internal"
Apr 24 14:24:15.014108 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.014075 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 14:24:15.014221 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.014204 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 14:24:15.014271 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.014251 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 14:24:15.014312 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:15.014266 2571 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://ac967476e400b4b769414ee30f15823b-8465cee92b2a1494.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/kube-system/pods\": read tcp 10.0.138.116:35694->54.158.36.98:6443: use of closed network connection" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-116.ec2.internal"
Apr 24 14:24:15.074411 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.074385 2571 apiserver.go:52] "Watching apiserver"
Apr 24 14:24:15.088368 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.088344 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 14:24:15.088717 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.088695 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-jxm64","openshift-network-operator/iptables-alerter-hwhv7","openshift-cluster-node-tuning-operator/tuned-xgvkw","openshift-dns/node-resolver-2rs7b","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-116.ec2.internal","openshift-multus/network-metrics-daemon-dkhdd","openshift-network-diagnostics/network-check-target-jlk8v","openshift-ovn-kubernetes/ovnkube-node-7ksfw","kube-system/konnectivity-agent-f25sj","kube-system/kube-apiserver-proxy-ip-10-0-138-116.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc","openshift-image-registry/node-ca-tlvj4","openshift-multus/multus-additional-cni-plugins-m79vj"]
Apr 24 14:24:15.094058 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.094034 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlk8v"
Apr 24 14:24:15.094058 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.094051 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 14:24:15.094198 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:15.094136 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlk8v" podUID="0a1eaa98-906e-4458-8492-83342d8bdd0f"
Apr 24 14:24:15.096802 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.096782 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hwhv7"
Apr 24 14:24:15.096948 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.096898 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.098693 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.098661 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 14:19:14 +0000 UTC" deadline="2027-12-15 14:06:29.924906742 +0000 UTC"
Apr 24 14:24:15.098693 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.098690 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14399h42m14.826219904s"
Apr 24 14:24:15.098886 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.098867 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 14:24:15.098953 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.098933 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 14:24:15.099741 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.099393 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 14:24:15.099741 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.099646 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dqmt8\""
Apr 24 14:24:15.099874 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.099794 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 14:24:15.099925 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.099899 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-88gmb\""
Apr 24 14:24:15.100041 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.099729 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 14:24:15.100294 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.100208 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-etc-sysctl-conf\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.100294 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.100264 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3307a337-f7bb-48ac-bb80-128ee9a46983-etc-tuned\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.100438 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.100268 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2rs7b"
Apr 24 14:24:15.100438 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.100305 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-etc-kubernetes\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.100438 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.100341 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-run\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.100438 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.100371 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-sys\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.100438 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.100418 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3307a337-f7bb-48ac-bb80-128ee9a46983-tmp\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.100650 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.100461 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58zf2\" (UniqueName: \"kubernetes.io/projected/3307a337-f7bb-48ac-bb80-128ee9a46983-kube-api-access-58zf2\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.100650 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.100497 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv982\" (UniqueName: \"kubernetes.io/projected/0a1eaa98-906e-4458-8492-83342d8bdd0f-kube-api-access-vv982\") pod \"network-check-target-jlk8v\" (UID: \"0a1eaa98-906e-4458-8492-83342d8bdd0f\") " pod="openshift-network-diagnostics/network-check-target-jlk8v"
Apr 24 14:24:15.100650 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.100534 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b650bb0-88a4-4b81-a9d8-a1f2b16a8c46-host-slash\") pod \"iptables-alerter-hwhv7\" (UID: \"7b650bb0-88a4-4b81-a9d8-a1f2b16a8c46\") " pod="openshift-network-operator/iptables-alerter-hwhv7"
Apr 24 14:24:15.100650 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.100575 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws7wh\" (UniqueName: \"kubernetes.io/projected/7b650bb0-88a4-4b81-a9d8-a1f2b16a8c46-kube-api-access-ws7wh\") pod \"iptables-alerter-hwhv7\" (UID: \"7b650bb0-88a4-4b81-a9d8-a1f2b16a8c46\") " pod="openshift-network-operator/iptables-alerter-hwhv7"
Apr 24 14:24:15.100905 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.100609 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-etc-sysconfig\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.100955 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.100943 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-etc-sysctl-d\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.101006 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.100968 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-lib-modules\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.101052 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.101005 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7b650bb0-88a4-4b81-a9d8-a1f2b16a8c46-iptables-alerter-script\") pod \"iptables-alerter-hwhv7\" (UID: \"7b650bb0-88a4-4b81-a9d8-a1f2b16a8c46\") " pod="openshift-network-operator/iptables-alerter-hwhv7"
Apr 24 14:24:15.101052 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.101029 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-etc-systemd\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.101165 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.101050 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-var-lib-kubelet\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.101165 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.101107 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-host\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.101165 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.101141 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-etc-modprobe-d\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.101974 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.101957 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-xvfpf\""
Apr 24 14:24:15.102059 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.101976 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 14:24:15.102128 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.102065 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 14:24:15.102994 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.102982 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkhdd"
Apr 24 14:24:15.103060 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:15.103046 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkhdd" podUID="7ed1658e-98f8-4fe9-bb01-60b235015d4b"
Apr 24 14:24:15.104662 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.104641 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 14:24:15.105532 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.105517 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.107271 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.107252 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 14:24:15.107400 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.107383 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 14:24:15.107443 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.107410 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 14:24:15.107485 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.107415 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vc9gg\""
Apr 24 14:24:15.107520 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.107501 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 14:24:15.107834 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.107820 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.109807 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.109791 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-f25sj"
Apr 24 14:24:15.111459 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.111439 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 14:24:15.111563 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.111440 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 14:24:15.112169 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.111648 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 14:24:15.112169 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.111664 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 14:24:15.112169 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.111804 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 14:24:15.112169 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.111922 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-ml6fg\""
Apr 24 14:24:15.112169 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.111930 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 14:24:15.112169 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.111970 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dq7l2\""
Apr 24 14:24:15.112169 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.112079 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 14:24:15.112169 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.112089 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 14:24:15.113044 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.113030 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc"
Apr 24 14:24:15.115172 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.115153 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tlvj4"
Apr 24 14:24:15.117378 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.117360 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m79vj"
Apr 24 14:24:15.121457 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.121437 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 14:24:15.121530 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.121448 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 14:24:15.121530 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.121481 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 14:24:15.121530 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.121499 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 14:24:15.125229 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.125212 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-zx5w5\""
Apr 24 14:24:15.125301 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.125211 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 14:24:15.126636 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.126618 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 14:24:15.126723 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.126662 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 14:24:15.126723 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.126676 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-m8lld\""
Apr 24 14:24:15.126723 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.126682 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 14:24:15.126959 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.126944 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-qrfns\""
Apr 24 14:24:15.141023 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.141004 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-2nxdb"
Apr 24 14:24:15.154509 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.154493 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-2nxdb"
Apr 24 14:24:15.164944 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.164924 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:24:15.196284 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.196265 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 14:24:15.201301 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201281 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-cni-binary-copy\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.201397 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201313 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-multus-daemon-config\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.201397 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201339 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/151dbb1d-0d3a-4890-8076-f774d13b7e70-env-overrides\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.201397 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201362 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/151dbb1d-0d3a-4890-8076-f774d13b7e70-ovn-node-metrics-cert\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.201397 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201387 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77wf9\" (UniqueName: \"kubernetes.io/projected/2ded180c-8601-4aaf-86bd-6a13b101faa8-kube-api-access-77wf9\") pod \"node-resolver-2rs7b\" (UID: \"2ded180c-8601-4aaf-86bd-6a13b101faa8\") " pod="openshift-dns/node-resolver-2rs7b"
Apr 24 14:24:15.201548 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201409 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-multus-cni-dir\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.201548 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201433 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1e92bec3-9630-4928-b6e4-dcf3fbc8dd82-konnectivity-ca\") pod \"konnectivity-agent-f25sj\" (UID: \"1e92bec3-9630-4928-b6e4-dcf3fbc8dd82\") " pod="kube-system/konnectivity-agent-f25sj"
Apr 24 14:24:15.201548 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201476 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0d48f14b-bdc0-4862-a203-5b9e2cb1299b-socket-dir\") pod \"aws-ebs-csi-driver-node-rsqtc\" (UID: \"0d48f14b-bdc0-4862-a203-5b9e2cb1299b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc"
Apr 24 14:24:15.201548 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201513 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs\") pod \"network-metrics-daemon-dkhdd\" (UID: \"7ed1658e-98f8-4fe9-bb01-60b235015d4b\") " pod="openshift-multus/network-metrics-daemon-dkhdd"
Apr 24 14:24:15.201548 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201540 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aace953f-49d0-4c47-9522-d00bf8dece62-system-cni-dir\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj"
Apr 24 14:24:15.201722 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201569 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp2cl\" (UniqueName: \"kubernetes.io/projected/aace953f-49d0-4c47-9522-d00bf8dece62-kube-api-access-zp2cl\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj"
Apr 24 14:24:15.201722 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201594 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d48f14b-bdc0-4862-a203-5b9e2cb1299b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rsqtc\" (UID: \"0d48f14b-bdc0-4862-a203-5b9e2cb1299b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc"
Apr 24 14:24:15.201722 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201618 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-hostroot\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.201722 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201640 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-multus-conf-dir\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.201722 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201662 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-host-run-multus-certs\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.201722 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201684 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-systemd-units\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.202007 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201731 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-etc-sysctl-conf\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.202007 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201756 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aace953f-49d0-4c47-9522-d00bf8dece62-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj"
Apr 24 14:24:15.202007 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201781 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-log-socket\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.202007 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201803 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-host-cni-netd\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.202007 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201829 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-run\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.202007 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201852 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-sys\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.202007 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201872 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-etc-sysctl-conf\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.202007 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201876 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0d48f14b-bdc0-4862-a203-5b9e2cb1299b-etc-selinux\") pod \"aws-ebs-csi-driver-node-rsqtc\" (UID: \"0d48f14b-bdc0-4862-a203-5b9e2cb1299b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc"
Apr 24 14:24:15.202007 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201927 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-run\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.202007 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201930 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2eb35e3-b76b-433c-b90d-a4481a2cd709-host\") pod \"node-ca-tlvj4\" (UID: \"d2eb35e3-b76b-433c-b90d-a4481a2cd709\") " pod="openshift-image-registry/node-ca-tlvj4"
Apr 24 14:24:15.202007 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.201968 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-var-lib-openvswitch\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.202410 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202002 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-sys\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.202410 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202069 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-node-log\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.202410 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202115 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b650bb0-88a4-4b81-a9d8-a1f2b16a8c46-host-slash\") pod \"iptables-alerter-hwhv7\" (UID: \"7b650bb0-88a4-4b81-a9d8-a1f2b16a8c46\") " pod="openshift-network-operator/iptables-alerter-hwhv7"
Apr 24 14:24:15.202410 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202140 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-lib-modules\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.202410 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202165 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-run-ovn\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.202410 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202190 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-etc-systemd\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.202410 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202216 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-host-run-k8s-cni-cncf-io\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.202410 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202190 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b650bb0-88a4-4b81-a9d8-a1f2b16a8c46-host-slash\") pod \"iptables-alerter-hwhv7\" (UID: \"7b650bb0-88a4-4b81-a9d8-a1f2b16a8c46\") " pod="openshift-network-operator/iptables-alerter-hwhv7"
Apr 24 14:24:15.202410 ip-10-0-138-116
kubenswrapper[2571]: I0424 14:24:15.202237 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-etc-systemd\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" Apr 24 14:24:15.202410 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202241 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-host-var-lib-kubelet\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64" Apr 24 14:24:15.202410 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202284 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-lib-modules\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" Apr 24 14:24:15.202410 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202275 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-multus-socket-dir-parent\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64" Apr 24 14:24:15.202410 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202335 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-host-run-netns\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64" Apr 24 14:24:15.202410 
ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202358 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-host-var-lib-cni-bin\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64" Apr 24 14:24:15.202410 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202390 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-etc-kubernetes\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64" Apr 24 14:24:15.202894 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202417 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-host-run-ovn-kubernetes\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.202894 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202444 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58zf2\" (UniqueName: \"kubernetes.io/projected/3307a337-f7bb-48ac-bb80-128ee9a46983-kube-api-access-58zf2\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" Apr 24 14:24:15.202894 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202470 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/aace953f-49d0-4c47-9522-d00bf8dece62-whereabouts-flatfile-configmap\") pod 
\"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj" Apr 24 14:24:15.202894 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202494 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-host-kubelet\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.202894 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202524 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.202894 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202547 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/151dbb1d-0d3a-4890-8076-f774d13b7e70-ovnkube-script-lib\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.202894 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202569 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0d48f14b-bdc0-4862-a203-5b9e2cb1299b-device-dir\") pod \"aws-ebs-csi-driver-node-rsqtc\" (UID: \"0d48f14b-bdc0-4862-a203-5b9e2cb1299b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc" Apr 24 14:24:15.202894 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202607 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vv982\" (UniqueName: \"kubernetes.io/projected/0a1eaa98-906e-4458-8492-83342d8bdd0f-kube-api-access-vv982\") pod \"network-check-target-jlk8v\" (UID: \"0a1eaa98-906e-4458-8492-83342d8bdd0f\") " pod="openshift-network-diagnostics/network-check-target-jlk8v" Apr 24 14:24:15.202894 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202636 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-etc-sysconfig\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" Apr 24 14:24:15.202894 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202758 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-var-lib-kubelet\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" Apr 24 14:24:15.202894 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202785 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-host\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" Apr 24 14:24:15.202894 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202761 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-etc-sysconfig\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" Apr 24 14:24:15.202894 ip-10-0-138-116 kubenswrapper[2571]: I0424 
14:24:15.202806 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-var-lib-kubelet\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" Apr 24 14:24:15.202894 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202811 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d48f14b-bdc0-4862-a203-5b9e2cb1299b-registration-dir\") pod \"aws-ebs-csi-driver-node-rsqtc\" (UID: \"0d48f14b-bdc0-4862-a203-5b9e2cb1299b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc" Apr 24 14:24:15.202894 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202845 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nbpd\" (UniqueName: \"kubernetes.io/projected/7ed1658e-98f8-4fe9-bb01-60b235015d4b-kube-api-access-2nbpd\") pod \"network-metrics-daemon-dkhdd\" (UID: \"7ed1658e-98f8-4fe9-bb01-60b235015d4b\") " pod="openshift-multus/network-metrics-daemon-dkhdd" Apr 24 14:24:15.202894 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202882 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-host\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" Apr 24 14:24:15.203399 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202909 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-host-run-netns\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.203399 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202935 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-etc-openvswitch\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.203399 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202951 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-run-openvswitch\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.203399 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202967 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kjf5\" (UniqueName: \"kubernetes.io/projected/d2eb35e3-b76b-433c-b90d-a4481a2cd709-kube-api-access-8kjf5\") pod \"node-ca-tlvj4\" (UID: \"d2eb35e3-b76b-433c-b90d-a4481a2cd709\") " pod="openshift-image-registry/node-ca-tlvj4" Apr 24 14:24:15.203399 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.202993 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aace953f-49d0-4c47-9522-d00bf8dece62-os-release\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj" Apr 24 14:24:15.203399 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203027 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/aace953f-49d0-4c47-9522-d00bf8dece62-cni-binary-copy\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj" Apr 24 14:24:15.203399 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203045 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2ded180c-8601-4aaf-86bd-6a13b101faa8-hosts-file\") pod \"node-resolver-2rs7b\" (UID: \"2ded180c-8601-4aaf-86bd-6a13b101faa8\") " pod="openshift-dns/node-resolver-2rs7b" Apr 24 14:24:15.203399 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203059 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-run-systemd\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.203399 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203074 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-host-cni-bin\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.203399 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203089 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-etc-kubernetes\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" Apr 24 14:24:15.203399 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203120 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3307a337-f7bb-48ac-bb80-128ee9a46983-tmp\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" Apr 24 14:24:15.203399 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203134 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0d48f14b-bdc0-4862-a203-5b9e2cb1299b-sys-fs\") pod \"aws-ebs-csi-driver-node-rsqtc\" (UID: \"0d48f14b-bdc0-4862-a203-5b9e2cb1299b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc" Apr 24 14:24:15.203399 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203143 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-etc-kubernetes\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" Apr 24 14:24:15.203399 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203168 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t6xg\" (UniqueName: \"kubernetes.io/projected/0d48f14b-bdc0-4862-a203-5b9e2cb1299b-kube-api-access-5t6xg\") pod \"aws-ebs-csi-driver-node-rsqtc\" (UID: \"0d48f14b-bdc0-4862-a203-5b9e2cb1299b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc" Apr 24 14:24:15.203399 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203190 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-cnibin\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64" Apr 24 14:24:15.203399 ip-10-0-138-116 
kubenswrapper[2571]: I0424 14:24:15.203204 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69tpz\" (UniqueName: \"kubernetes.io/projected/151dbb1d-0d3a-4890-8076-f774d13b7e70-kube-api-access-69tpz\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.203961 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203221 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ws7wh\" (UniqueName: \"kubernetes.io/projected/7b650bb0-88a4-4b81-a9d8-a1f2b16a8c46-kube-api-access-ws7wh\") pod \"iptables-alerter-hwhv7\" (UID: \"7b650bb0-88a4-4b81-a9d8-a1f2b16a8c46\") " pod="openshift-network-operator/iptables-alerter-hwhv7" Apr 24 14:24:15.203961 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203235 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-etc-sysctl-d\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" Apr 24 14:24:15.203961 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203256 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d2eb35e3-b76b-433c-b90d-a4481a2cd709-serviceca\") pod \"node-ca-tlvj4\" (UID: \"d2eb35e3-b76b-433c-b90d-a4481a2cd709\") " pod="openshift-image-registry/node-ca-tlvj4" Apr 24 14:24:15.203961 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203274 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aace953f-49d0-4c47-9522-d00bf8dece62-cnibin\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " 
pod="openshift-multus/multus-additional-cni-plugins-m79vj" Apr 24 14:24:15.203961 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203291 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-host-slash\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.203961 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203313 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/151dbb1d-0d3a-4890-8076-f774d13b7e70-ovnkube-config\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.203961 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203339 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7b650bb0-88a4-4b81-a9d8-a1f2b16a8c46-iptables-alerter-script\") pod \"iptables-alerter-hwhv7\" (UID: \"7b650bb0-88a4-4b81-a9d8-a1f2b16a8c46\") " pod="openshift-network-operator/iptables-alerter-hwhv7" Apr 24 14:24:15.203961 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203362 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-system-cni-dir\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64" Apr 24 14:24:15.203961 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203382 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 14:24:15.203961 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203378 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-host-var-lib-cni-multus\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64" Apr 24 14:24:15.203961 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203366 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-etc-sysctl-d\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" Apr 24 14:24:15.203961 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203422 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5r9d\" (UniqueName: \"kubernetes.io/projected/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-kube-api-access-m5r9d\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64" Apr 24 14:24:15.203961 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203457 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-etc-modprobe-d\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" Apr 24 14:24:15.203961 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203486 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/3307a337-f7bb-48ac-bb80-128ee9a46983-etc-tuned\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" Apr 24 14:24:15.203961 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203516 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1e92bec3-9630-4928-b6e4-dcf3fbc8dd82-agent-certs\") pod \"konnectivity-agent-f25sj\" (UID: \"1e92bec3-9630-4928-b6e4-dcf3fbc8dd82\") " pod="kube-system/konnectivity-agent-f25sj" Apr 24 14:24:15.203961 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203543 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aace953f-49d0-4c47-9522-d00bf8dece62-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj" Apr 24 14:24:15.203961 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203572 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2ded180c-8601-4aaf-86bd-6a13b101faa8-tmp-dir\") pod \"node-resolver-2rs7b\" (UID: \"2ded180c-8601-4aaf-86bd-6a13b101faa8\") " pod="openshift-dns/node-resolver-2rs7b" Apr 24 14:24:15.204423 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203585 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3307a337-f7bb-48ac-bb80-128ee9a46983-etc-modprobe-d\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" Apr 24 14:24:15.204423 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203593 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-os-release\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64" Apr 24 14:24:15.204423 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.203757 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7b650bb0-88a4-4b81-a9d8-a1f2b16a8c46-iptables-alerter-script\") pod \"iptables-alerter-hwhv7\" (UID: \"7b650bb0-88a4-4b81-a9d8-a1f2b16a8c46\") " pod="openshift-network-operator/iptables-alerter-hwhv7" Apr 24 14:24:15.205912 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.205895 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3307a337-f7bb-48ac-bb80-128ee9a46983-tmp\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" Apr 24 14:24:15.205974 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.205914 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3307a337-f7bb-48ac-bb80-128ee9a46983-etc-tuned\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" Apr 24 14:24:15.210792 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:15.210776 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:15.210792 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:15.210793 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:15.210901 
ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:15.210803 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vv982 for pod openshift-network-diagnostics/network-check-target-jlk8v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:15.210901 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:15.210852 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a1eaa98-906e-4458-8492-83342d8bdd0f-kube-api-access-vv982 podName:0a1eaa98-906e-4458-8492-83342d8bdd0f nodeName:}" failed. No retries permitted until 2026-04-24 14:24:15.710837476 +0000 UTC m=+2.042115240 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-vv982" (UniqueName: "kubernetes.io/projected/0a1eaa98-906e-4458-8492-83342d8bdd0f-kube-api-access-vv982") pod "network-check-target-jlk8v" (UID: "0a1eaa98-906e-4458-8492-83342d8bdd0f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:15.212721 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.212699 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58zf2\" (UniqueName: \"kubernetes.io/projected/3307a337-f7bb-48ac-bb80-128ee9a46983-kube-api-access-58zf2\") pod \"tuned-xgvkw\" (UID: \"3307a337-f7bb-48ac-bb80-128ee9a46983\") " pod="openshift-cluster-node-tuning-operator/tuned-xgvkw"
Apr 24 14:24:15.213407 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.213392 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws7wh\" (UniqueName: \"kubernetes.io/projected/7b650bb0-88a4-4b81-a9d8-a1f2b16a8c46-kube-api-access-ws7wh\") pod \"iptables-alerter-hwhv7\" (UID: \"7b650bb0-88a4-4b81-a9d8-a1f2b16a8c46\") " pod="openshift-network-operator/iptables-alerter-hwhv7"
Apr 24 14:24:15.239006 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:15.238973 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63ded8f515ead39ec1575ef940918d4c.slice/crio-5fa9974e657185d395c867d639d4794a8a38e93e82dc1fbf7fb66dc72532a0a7 WatchSource:0}: Error finding container 5fa9974e657185d395c867d639d4794a8a38e93e82dc1fbf7fb66dc72532a0a7: Status 404 returned error can't find the container with id 5fa9974e657185d395c867d639d4794a8a38e93e82dc1fbf7fb66dc72532a0a7
Apr 24 14:24:15.239243 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:15.239229 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67ed165269d2376e0c4ebf616952f5d1.slice/crio-65c3240e586a706331537d56bac3715678bd92d6f849777df1dec49f8037291e WatchSource:0}: Error finding container 65c3240e586a706331537d56bac3715678bd92d6f849777df1dec49f8037291e: Status 404 returned error can't find the container with id 65c3240e586a706331537d56bac3715678bd92d6f849777df1dec49f8037291e
Apr 24 14:24:15.243348 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.243329 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 14:24:15.269339 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.268133 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:24:15.304179 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304156 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-host-var-lib-kubelet\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.304281 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304183 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-multus-socket-dir-parent\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.304281 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304199 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-host-run-netns\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.304281 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304213 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-host-var-lib-cni-bin\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.304281 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304236 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-etc-kubernetes\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.304281 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304261 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-host-run-ovn-kubernetes\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.304281 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304264 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-host-run-netns\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.304281 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304275 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-host-var-lib-kubelet\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.304506 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304263 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-multus-socket-dir-parent\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.304506 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304306 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/aace953f-49d0-4c47-9522-d00bf8dece62-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj"
Apr 24 14:24:15.304506 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304313 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-host-run-ovn-kubernetes\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.304506 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304322 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-host-var-lib-cni-bin\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.304506 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304358 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-etc-kubernetes\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.304506 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304375 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-host-kubelet\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.304506 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304416 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.304506 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304433 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-host-kubelet\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.304506 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304435 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/151dbb1d-0d3a-4890-8076-f774d13b7e70-ovnkube-script-lib\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.304506 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304469 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0d48f14b-bdc0-4862-a203-5b9e2cb1299b-device-dir\") pod \"aws-ebs-csi-driver-node-rsqtc\" (UID: \"0d48f14b-bdc0-4862-a203-5b9e2cb1299b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc"
Apr 24 14:24:15.304506 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304493 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.304506 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304502 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d48f14b-bdc0-4862-a203-5b9e2cb1299b-registration-dir\") pod \"aws-ebs-csi-driver-node-rsqtc\" (UID: \"0d48f14b-bdc0-4862-a203-5b9e2cb1299b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc"
Apr 24 14:24:15.304897 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304519 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nbpd\" (UniqueName: \"kubernetes.io/projected/7ed1658e-98f8-4fe9-bb01-60b235015d4b-kube-api-access-2nbpd\") pod \"network-metrics-daemon-dkhdd\" (UID: \"7ed1658e-98f8-4fe9-bb01-60b235015d4b\") " pod="openshift-multus/network-metrics-daemon-dkhdd"
Apr 24 14:24:15.304897 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304538 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-host-run-netns\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.304897 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304549 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d48f14b-bdc0-4862-a203-5b9e2cb1299b-registration-dir\") pod \"aws-ebs-csi-driver-node-rsqtc\" (UID: \"0d48f14b-bdc0-4862-a203-5b9e2cb1299b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc"
Apr 24 14:24:15.304897 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304561 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-etc-openvswitch\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.304897 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304588 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0d48f14b-bdc0-4862-a203-5b9e2cb1299b-device-dir\") pod \"aws-ebs-csi-driver-node-rsqtc\" (UID: \"0d48f14b-bdc0-4862-a203-5b9e2cb1299b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc"
Apr 24 14:24:15.304897 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304624 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-host-run-netns\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.304897 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304657 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-etc-openvswitch\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.304897 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304663 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-run-openvswitch\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.304897 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304697 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kjf5\" (UniqueName: \"kubernetes.io/projected/d2eb35e3-b76b-433c-b90d-a4481a2cd709-kube-api-access-8kjf5\") pod \"node-ca-tlvj4\" (UID: \"d2eb35e3-b76b-433c-b90d-a4481a2cd709\") " pod="openshift-image-registry/node-ca-tlvj4"
Apr 24 14:24:15.304897 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304721 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-run-openvswitch\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.304897 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304721 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aace953f-49d0-4c47-9522-d00bf8dece62-os-release\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj"
Apr 24 14:24:15.304897 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304753 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aace953f-49d0-4c47-9522-d00bf8dece62-cni-binary-copy\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj"
Apr 24 14:24:15.304897 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304771 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2ded180c-8601-4aaf-86bd-6a13b101faa8-hosts-file\") pod \"node-resolver-2rs7b\" (UID: \"2ded180c-8601-4aaf-86bd-6a13b101faa8\") " pod="openshift-dns/node-resolver-2rs7b"
Apr 24 14:24:15.304897 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304787 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-run-systemd\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.304897 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304806 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-host-cni-bin\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.304897 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304823 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0d48f14b-bdc0-4862-a203-5b9e2cb1299b-sys-fs\") pod \"aws-ebs-csi-driver-node-rsqtc\" (UID: \"0d48f14b-bdc0-4862-a203-5b9e2cb1299b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc"
Apr 24 14:24:15.304897 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304837 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5t6xg\" (UniqueName: \"kubernetes.io/projected/0d48f14b-bdc0-4862-a203-5b9e2cb1299b-kube-api-access-5t6xg\") pod \"aws-ebs-csi-driver-node-rsqtc\" (UID: \"0d48f14b-bdc0-4862-a203-5b9e2cb1299b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc"
Apr 24 14:24:15.305649 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304838 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/aace953f-49d0-4c47-9522-d00bf8dece62-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj"
Apr 24 14:24:15.305649 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304858 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-cnibin\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.305649 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304844 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aace953f-49d0-4c47-9522-d00bf8dece62-os-release\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj"
Apr 24 14:24:15.305649 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304882 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2ded180c-8601-4aaf-86bd-6a13b101faa8-hosts-file\") pod \"node-resolver-2rs7b\" (UID: \"2ded180c-8601-4aaf-86bd-6a13b101faa8\") " pod="openshift-dns/node-resolver-2rs7b"
Apr 24 14:24:15.305649 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304882 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69tpz\" (UniqueName: \"kubernetes.io/projected/151dbb1d-0d3a-4890-8076-f774d13b7e70-kube-api-access-69tpz\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.305649 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304901 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-host-cni-bin\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.305649 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304930 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/151dbb1d-0d3a-4890-8076-f774d13b7e70-ovnkube-script-lib\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.305649 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304954 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-cnibin\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.305649 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304936 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d2eb35e3-b76b-433c-b90d-a4481a2cd709-serviceca\") pod \"node-ca-tlvj4\" (UID: \"d2eb35e3-b76b-433c-b90d-a4481a2cd709\") " pod="openshift-image-registry/node-ca-tlvj4"
Apr 24 14:24:15.305649 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304932 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-run-systemd\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.305649 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304969 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0d48f14b-bdc0-4862-a203-5b9e2cb1299b-sys-fs\") pod \"aws-ebs-csi-driver-node-rsqtc\" (UID: \"0d48f14b-bdc0-4862-a203-5b9e2cb1299b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc"
Apr 24 14:24:15.305649 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.304993 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aace953f-49d0-4c47-9522-d00bf8dece62-cnibin\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj"
Apr 24 14:24:15.305649 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305011 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aace953f-49d0-4c47-9522-d00bf8dece62-cnibin\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj"
Apr 24 14:24:15.305649 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305018 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-host-slash\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.305649 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305041 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-host-slash\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.305649 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305043 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/151dbb1d-0d3a-4890-8076-f774d13b7e70-ovnkube-config\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.305649 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305071 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-system-cni-dir\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.305649 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305093 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-host-var-lib-cni-multus\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.306527 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305133 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5r9d\" (UniqueName: \"kubernetes.io/projected/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-kube-api-access-m5r9d\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.306527 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305147 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-system-cni-dir\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.306527 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305151 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1e92bec3-9630-4928-b6e4-dcf3fbc8dd82-agent-certs\") pod \"konnectivity-agent-f25sj\" (UID: \"1e92bec3-9630-4928-b6e4-dcf3fbc8dd82\") " pod="kube-system/konnectivity-agent-f25sj"
Apr 24 14:24:15.306527 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305196 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aace953f-49d0-4c47-9522-d00bf8dece62-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj"
Apr 24 14:24:15.306527 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305213 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-host-var-lib-cni-multus\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.306527 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305223 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2ded180c-8601-4aaf-86bd-6a13b101faa8-tmp-dir\") pod \"node-resolver-2rs7b\" (UID: \"2ded180c-8601-4aaf-86bd-6a13b101faa8\") " pod="openshift-dns/node-resolver-2rs7b"
Apr 24 14:24:15.306527 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305254 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-os-release\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.306527 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305276 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-cni-binary-copy\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.306527 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305299 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-multus-daemon-config\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.306527 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305321 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/151dbb1d-0d3a-4890-8076-f774d13b7e70-env-overrides\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.306527 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305371 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/151dbb1d-0d3a-4890-8076-f774d13b7e70-ovn-node-metrics-cert\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.306527 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305375 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d2eb35e3-b76b-433c-b90d-a4481a2cd709-serviceca\") pod \"node-ca-tlvj4\" (UID: \"d2eb35e3-b76b-433c-b90d-a4481a2cd709\") " pod="openshift-image-registry/node-ca-tlvj4"
Apr 24 14:24:15.306527 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305386 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aace953f-49d0-4c47-9522-d00bf8dece62-cni-binary-copy\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj"
Apr 24 14:24:15.306527 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305405 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77wf9\" (UniqueName: \"kubernetes.io/projected/2ded180c-8601-4aaf-86bd-6a13b101faa8-kube-api-access-77wf9\") pod \"node-resolver-2rs7b\" (UID: \"2ded180c-8601-4aaf-86bd-6a13b101faa8\") " pod="openshift-dns/node-resolver-2rs7b"
Apr 24 14:24:15.306527 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305432 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-multus-cni-dir\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.306527 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305462 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-os-release\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.306527 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305465 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1e92bec3-9630-4928-b6e4-dcf3fbc8dd82-konnectivity-ca\") pod \"konnectivity-agent-f25sj\" (UID: \"1e92bec3-9630-4928-b6e4-dcf3fbc8dd82\") " pod="kube-system/konnectivity-agent-f25sj"
Apr 24 14:24:15.306527 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305509 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0d48f14b-bdc0-4862-a203-5b9e2cb1299b-socket-dir\") pod \"aws-ebs-csi-driver-node-rsqtc\" (UID: \"0d48f14b-bdc0-4862-a203-5b9e2cb1299b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc"
Apr 24 14:24:15.307366 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305520 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2ded180c-8601-4aaf-86bd-6a13b101faa8-tmp-dir\") pod \"node-resolver-2rs7b\" (UID: \"2ded180c-8601-4aaf-86bd-6a13b101faa8\") " pod="openshift-dns/node-resolver-2rs7b"
Apr 24 14:24:15.307366 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305535 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs\") pod \"network-metrics-daemon-dkhdd\" (UID: \"7ed1658e-98f8-4fe9-bb01-60b235015d4b\") " pod="openshift-multus/network-metrics-daemon-dkhdd"
Apr 24 14:24:15.307366 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305573 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aace953f-49d0-4c47-9522-d00bf8dece62-system-cni-dir\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj"
Apr 24 14:24:15.307366 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305605 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zp2cl\" (UniqueName: \"kubernetes.io/projected/aace953f-49d0-4c47-9522-d00bf8dece62-kube-api-access-zp2cl\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj"
Apr 24 14:24:15.307366 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305638 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d48f14b-bdc0-4862-a203-5b9e2cb1299b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rsqtc\" (UID: \"0d48f14b-bdc0-4862-a203-5b9e2cb1299b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc"
Apr 24 14:24:15.307366 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305646 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0d48f14b-bdc0-4862-a203-5b9e2cb1299b-socket-dir\") pod \"aws-ebs-csi-driver-node-rsqtc\" (UID: \"0d48f14b-bdc0-4862-a203-5b9e2cb1299b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc"
Apr 24 14:24:15.307366 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305667 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-hostroot\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.307366 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305691 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-multus-conf-dir\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.307366 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305713 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-host-run-multus-certs\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.307366 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305776 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-systemd-units\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.307366 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305804 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aace953f-49d0-4c47-9522-d00bf8dece62-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj"
Apr 24 14:24:15.307366 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:15.305818 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:15.307366 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305830 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-log-socket\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.307366 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305846 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aace953f-49d0-4c47-9522-d00bf8dece62-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj"
Apr 24 14:24:15.307366 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305854 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-host-cni-netd\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.307366 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:15.305888 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs podName:7ed1658e-98f8-4fe9-bb01-60b235015d4b nodeName:}" failed. No retries permitted until 2026-04-24 14:24:15.805869261 +0000 UTC m=+2.137147008 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs") pod "network-metrics-daemon-dkhdd" (UID: "7ed1658e-98f8-4fe9-bb01-60b235015d4b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:15.307366 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305901 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-multus-conf-dir\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.308008 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305907 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-host-run-multus-certs\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64"
Apr 24 14:24:15.308008 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305930 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0d48f14b-bdc0-4862-a203-5b9e2cb1299b-etc-selinux\") pod \"aws-ebs-csi-driver-node-rsqtc\" (UID: \"0d48f14b-bdc0-4862-a203-5b9e2cb1299b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc"
Apr 24 14:24:15.308008 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305948 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-systemd-units\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:15.308008 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305957 2571 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2eb35e3-b76b-433c-b90d-a4481a2cd709-host\") pod \"node-ca-tlvj4\" (UID: \"d2eb35e3-b76b-433c-b90d-a4481a2cd709\") " pod="openshift-image-registry/node-ca-tlvj4" Apr 24 14:24:15.308008 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305961 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1e92bec3-9630-4928-b6e4-dcf3fbc8dd82-konnectivity-ca\") pod \"konnectivity-agent-f25sj\" (UID: \"1e92bec3-9630-4928-b6e4-dcf3fbc8dd82\") " pod="kube-system/konnectivity-agent-f25sj" Apr 24 14:24:15.308008 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305980 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-var-lib-openvswitch\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.308008 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.306005 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-node-log\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.308008 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.306053 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-node-log\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.308008 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.306054 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-run-ovn\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.308008 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.306079 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aace953f-49d0-4c47-9522-d00bf8dece62-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj" Apr 24 14:24:15.308008 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.306083 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-run-ovn\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.308008 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.306091 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-host-run-k8s-cni-cncf-io\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64" Apr 24 14:24:15.308008 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.306138 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-host-run-k8s-cni-cncf-io\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64" Apr 24 14:24:15.308008 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.306162 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-multus-cni-dir\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64" Apr 24 14:24:15.308008 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.306174 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-log-socket\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.308008 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.306179 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-host-cni-netd\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.308008 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.305933 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/151dbb1d-0d3a-4890-8076-f774d13b7e70-env-overrides\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.308008 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.306216 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0d48f14b-bdc0-4862-a203-5b9e2cb1299b-etc-selinux\") pod \"aws-ebs-csi-driver-node-rsqtc\" (UID: \"0d48f14b-bdc0-4862-a203-5b9e2cb1299b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc" Apr 24 14:24:15.308510 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.306212 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/d2eb35e3-b76b-433c-b90d-a4481a2cd709-host\") pod \"node-ca-tlvj4\" (UID: \"d2eb35e3-b76b-433c-b90d-a4481a2cd709\") " pod="openshift-image-registry/node-ca-tlvj4" Apr 24 14:24:15.308510 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.306247 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aace953f-49d0-4c47-9522-d00bf8dece62-system-cni-dir\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj" Apr 24 14:24:15.308510 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.306255 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/151dbb1d-0d3a-4890-8076-f774d13b7e70-var-lib-openvswitch\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.308510 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.306270 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d48f14b-bdc0-4862-a203-5b9e2cb1299b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rsqtc\" (UID: \"0d48f14b-bdc0-4862-a203-5b9e2cb1299b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc" Apr 24 14:24:15.308510 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.306280 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/151dbb1d-0d3a-4890-8076-f774d13b7e70-ovnkube-config\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.308510 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.306296 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" 
(UniqueName: \"kubernetes.io/host-path/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-hostroot\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64" Apr 24 14:24:15.308510 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.306697 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-multus-daemon-config\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64" Apr 24 14:24:15.308510 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.307197 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-cni-binary-copy\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64" Apr 24 14:24:15.308510 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.307989 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1e92bec3-9630-4928-b6e4-dcf3fbc8dd82-agent-certs\") pod \"konnectivity-agent-f25sj\" (UID: \"1e92bec3-9630-4928-b6e4-dcf3fbc8dd82\") " pod="kube-system/konnectivity-agent-f25sj" Apr 24 14:24:15.308510 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.307996 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/151dbb1d-0d3a-4890-8076-f774d13b7e70-ovn-node-metrics-cert\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.312075 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.312055 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kjf5\" (UniqueName: 
\"kubernetes.io/projected/d2eb35e3-b76b-433c-b90d-a4481a2cd709-kube-api-access-8kjf5\") pod \"node-ca-tlvj4\" (UID: \"d2eb35e3-b76b-433c-b90d-a4481a2cd709\") " pod="openshift-image-registry/node-ca-tlvj4" Apr 24 14:24:15.312510 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.312492 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69tpz\" (UniqueName: \"kubernetes.io/projected/151dbb1d-0d3a-4890-8076-f774d13b7e70-kube-api-access-69tpz\") pod \"ovnkube-node-7ksfw\" (UID: \"151dbb1d-0d3a-4890-8076-f774d13b7e70\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.312606 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.312580 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t6xg\" (UniqueName: \"kubernetes.io/projected/0d48f14b-bdc0-4862-a203-5b9e2cb1299b-kube-api-access-5t6xg\") pod \"aws-ebs-csi-driver-node-rsqtc\" (UID: \"0d48f14b-bdc0-4862-a203-5b9e2cb1299b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc" Apr 24 14:24:15.313113 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.313077 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nbpd\" (UniqueName: \"kubernetes.io/projected/7ed1658e-98f8-4fe9-bb01-60b235015d4b-kube-api-access-2nbpd\") pod \"network-metrics-daemon-dkhdd\" (UID: \"7ed1658e-98f8-4fe9-bb01-60b235015d4b\") " pod="openshift-multus/network-metrics-daemon-dkhdd" Apr 24 14:24:15.313314 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.313299 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5r9d\" (UniqueName: \"kubernetes.io/projected/75370bd4-7795-4ebf-8a12-27eda2d9b1d7-kube-api-access-m5r9d\") pod \"multus-jxm64\" (UID: \"75370bd4-7795-4ebf-8a12-27eda2d9b1d7\") " pod="openshift-multus/multus-jxm64" Apr 24 14:24:15.313869 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.313853 2571 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zp2cl\" (UniqueName: \"kubernetes.io/projected/aace953f-49d0-4c47-9522-d00bf8dece62-kube-api-access-zp2cl\") pod \"multus-additional-cni-plugins-m79vj\" (UID: \"aace953f-49d0-4c47-9522-d00bf8dece62\") " pod="openshift-multus/multus-additional-cni-plugins-m79vj" Apr 24 14:24:15.313869 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.313866 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77wf9\" (UniqueName: \"kubernetes.io/projected/2ded180c-8601-4aaf-86bd-6a13b101faa8-kube-api-access-77wf9\") pod \"node-resolver-2rs7b\" (UID: \"2ded180c-8601-4aaf-86bd-6a13b101faa8\") " pod="openshift-dns/node-resolver-2rs7b" Apr 24 14:24:15.429576 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.429518 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hwhv7" Apr 24 14:24:15.435336 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:15.435315 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b650bb0_88a4_4b81_a9d8_a1f2b16a8c46.slice/crio-c0c89d195bb262b23fe68913b8fcf7fabe53cdf45a29441557ef7a47eb15aa1f WatchSource:0}: Error finding container c0c89d195bb262b23fe68913b8fcf7fabe53cdf45a29441557ef7a47eb15aa1f: Status 404 returned error can't find the container with id c0c89d195bb262b23fe68913b8fcf7fabe53cdf45a29441557ef7a47eb15aa1f Apr 24 14:24:15.448869 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.448854 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" Apr 24 14:24:15.452325 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.452307 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2rs7b" Apr 24 14:24:15.454279 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:15.454255 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3307a337_f7bb_48ac_bb80_128ee9a46983.slice/crio-dfdb75982a0e7db1f1e8d36a4d9094dc1ea2467e2a7ae2ae9ebe4cbeed1dbd83 WatchSource:0}: Error finding container dfdb75982a0e7db1f1e8d36a4d9094dc1ea2467e2a7ae2ae9ebe4cbeed1dbd83: Status 404 returned error can't find the container with id dfdb75982a0e7db1f1e8d36a4d9094dc1ea2467e2a7ae2ae9ebe4cbeed1dbd83 Apr 24 14:24:15.458846 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:15.458827 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ded180c_8601_4aaf_86bd_6a13b101faa8.slice/crio-fbc9706649181c621caa56414e3eb504d844035f5377e9c0bee0227612b221aa WatchSource:0}: Error finding container fbc9706649181c621caa56414e3eb504d844035f5377e9c0bee0227612b221aa: Status 404 returned error can't find the container with id fbc9706649181c621caa56414e3eb504d844035f5377e9c0bee0227612b221aa Apr 24 14:24:15.484771 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.484751 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jxm64" Apr 24 14:24:15.490309 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.490295 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:24:15.496331 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:15.496312 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod151dbb1d_0d3a_4890_8076_f774d13b7e70.slice/crio-5fef62a7ac35d2e7a9818b8dee22ed270791e82362ff1e28eac645162bbadb7c WatchSource:0}: Error finding container 5fef62a7ac35d2e7a9818b8dee22ed270791e82362ff1e28eac645162bbadb7c: Status 404 returned error can't find the container with id 5fef62a7ac35d2e7a9818b8dee22ed270791e82362ff1e28eac645162bbadb7c Apr 24 14:24:15.521109 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.521076 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-f25sj" Apr 24 14:24:15.526242 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:15.526223 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e92bec3_9630_4928_b6e4_dcf3fbc8dd82.slice/crio-7cad192ade9b4fbfdab341ae52d562e1fc8dc9bf01f34fc8f843162c1ab23909 WatchSource:0}: Error finding container 7cad192ade9b4fbfdab341ae52d562e1fc8dc9bf01f34fc8f843162c1ab23909: Status 404 returned error can't find the container with id 7cad192ade9b4fbfdab341ae52d562e1fc8dc9bf01f34fc8f843162c1ab23909 Apr 24 14:24:15.540039 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.540023 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc" Apr 24 14:24:15.544764 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.544743 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-tlvj4" Apr 24 14:24:15.544972 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:15.544953 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d48f14b_bdc0_4862_a203_5b9e2cb1299b.slice/crio-b8a8ff3c03646747b9ce28a97aa7268d44a96d6ad79fccaa15279f9b57eb14ca WatchSource:0}: Error finding container b8a8ff3c03646747b9ce28a97aa7268d44a96d6ad79fccaa15279f9b57eb14ca: Status 404 returned error can't find the container with id b8a8ff3c03646747b9ce28a97aa7268d44a96d6ad79fccaa15279f9b57eb14ca Apr 24 14:24:15.549571 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.549553 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m79vj" Apr 24 14:24:15.549775 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:15.549754 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2eb35e3_b76b_433c_b90d_a4481a2cd709.slice/crio-bd620675c2355ddc36051e9100b464eca68e75885e8c61c69b67a9fe4f6718cd WatchSource:0}: Error finding container bd620675c2355ddc36051e9100b464eca68e75885e8c61c69b67a9fe4f6718cd: Status 404 returned error can't find the container with id bd620675c2355ddc36051e9100b464eca68e75885e8c61c69b67a9fe4f6718cd Apr 24 14:24:15.554970 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:15.554953 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaace953f_49d0_4c47_9522_d00bf8dece62.slice/crio-c859fb1b59be03311605530e0fa26ae8cd56bf02e5882c723c8207259f0df1d9 WatchSource:0}: Error finding container c859fb1b59be03311605530e0fa26ae8cd56bf02e5882c723c8207259f0df1d9: Status 404 returned error can't find the container with id c859fb1b59be03311605530e0fa26ae8cd56bf02e5882c723c8207259f0df1d9 Apr 24 14:24:15.809155 ip-10-0-138-116 
kubenswrapper[2571]: I0424 14:24:15.809123 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs\") pod \"network-metrics-daemon-dkhdd\" (UID: \"7ed1658e-98f8-4fe9-bb01-60b235015d4b\") " pod="openshift-multus/network-metrics-daemon-dkhdd" Apr 24 14:24:15.809328 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:15.809192 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vv982\" (UniqueName: \"kubernetes.io/projected/0a1eaa98-906e-4458-8492-83342d8bdd0f-kube-api-access-vv982\") pod \"network-check-target-jlk8v\" (UID: \"0a1eaa98-906e-4458-8492-83342d8bdd0f\") " pod="openshift-network-diagnostics/network-check-target-jlk8v" Apr 24 14:24:15.809394 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:15.809327 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:15.809394 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:15.809345 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:15.809394 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:15.809356 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vv982 for pod openshift-network-diagnostics/network-check-target-jlk8v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:15.809530 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:15.809411 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a1eaa98-906e-4458-8492-83342d8bdd0f-kube-api-access-vv982 podName:0a1eaa98-906e-4458-8492-83342d8bdd0f 
nodeName:}" failed. No retries permitted until 2026-04-24 14:24:16.809393021 +0000 UTC m=+3.140670780 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-vv982" (UniqueName: "kubernetes.io/projected/0a1eaa98-906e-4458-8492-83342d8bdd0f-kube-api-access-vv982") pod "network-check-target-jlk8v" (UID: "0a1eaa98-906e-4458-8492-83342d8bdd0f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:15.809858 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:15.809838 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:15.809943 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:15.809898 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs podName:7ed1658e-98f8-4fe9-bb01-60b235015d4b nodeName:}" failed. No retries permitted until 2026-04-24 14:24:16.809881135 +0000 UTC m=+3.141158897 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs") pod "network-metrics-daemon-dkhdd" (UID: "7ed1658e-98f8-4fe9-bb01-60b235015d4b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:16.155932 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:16.155507 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 14:19:15 +0000 UTC" deadline="2027-11-13 09:14:15.624321508 +0000 UTC" Apr 24 14:24:16.155932 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:16.155763 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13626h49m59.468564155s" Apr 24 14:24:16.255277 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:16.255213 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-116.ec2.internal" event={"ID":"63ded8f515ead39ec1575ef940918d4c","Type":"ContainerStarted","Data":"5fa9974e657185d395c867d639d4794a8a38e93e82dc1fbf7fb66dc72532a0a7"} Apr 24 14:24:16.280298 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:16.280262 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-116.ec2.internal" event={"ID":"67ed165269d2376e0c4ebf616952f5d1","Type":"ContainerStarted","Data":"65c3240e586a706331537d56bac3715678bd92d6f849777df1dec49f8037291e"} Apr 24 14:24:16.289314 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:16.289196 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" event={"ID":"151dbb1d-0d3a-4890-8076-f774d13b7e70","Type":"ContainerStarted","Data":"5fef62a7ac35d2e7a9818b8dee22ed270791e82362ff1e28eac645162bbadb7c"} Apr 24 14:24:16.304296 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:16.304268 2571 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/multus-jxm64" event={"ID":"75370bd4-7795-4ebf-8a12-27eda2d9b1d7","Type":"ContainerStarted","Data":"5682343c305da97afe98bf0c032c50febf03d4e14923f30059b7f707a06f60e3"} Apr 24 14:24:16.328701 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:16.328680 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:24:16.331778 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:16.331716 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" event={"ID":"3307a337-f7bb-48ac-bb80-128ee9a46983","Type":"ContainerStarted","Data":"dfdb75982a0e7db1f1e8d36a4d9094dc1ea2467e2a7ae2ae9ebe4cbeed1dbd83"} Apr 24 14:24:16.334117 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:16.334027 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m79vj" event={"ID":"aace953f-49d0-4c47-9522-d00bf8dece62","Type":"ContainerStarted","Data":"c859fb1b59be03311605530e0fa26ae8cd56bf02e5882c723c8207259f0df1d9"} Apr 24 14:24:16.354428 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:16.354404 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tlvj4" event={"ID":"d2eb35e3-b76b-433c-b90d-a4481a2cd709","Type":"ContainerStarted","Data":"bd620675c2355ddc36051e9100b464eca68e75885e8c61c69b67a9fe4f6718cd"} Apr 24 14:24:16.383135 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:16.383093 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc" event={"ID":"0d48f14b-bdc0-4862-a203-5b9e2cb1299b","Type":"ContainerStarted","Data":"b8a8ff3c03646747b9ce28a97aa7268d44a96d6ad79fccaa15279f9b57eb14ca"} Apr 24 14:24:16.404767 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:16.404736 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-f25sj" 
event={"ID":"1e92bec3-9630-4928-b6e4-dcf3fbc8dd82","Type":"ContainerStarted","Data":"7cad192ade9b4fbfdab341ae52d562e1fc8dc9bf01f34fc8f843162c1ab23909"} Apr 24 14:24:16.420920 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:16.420847 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2rs7b" event={"ID":"2ded180c-8601-4aaf-86bd-6a13b101faa8","Type":"ContainerStarted","Data":"fbc9706649181c621caa56414e3eb504d844035f5377e9c0bee0227612b221aa"} Apr 24 14:24:16.424211 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:16.424158 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hwhv7" event={"ID":"7b650bb0-88a4-4b81-a9d8-a1f2b16a8c46","Type":"ContainerStarted","Data":"c0c89d195bb262b23fe68913b8fcf7fabe53cdf45a29441557ef7a47eb15aa1f"} Apr 24 14:24:16.821599 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:16.821375 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs\") pod \"network-metrics-daemon-dkhdd\" (UID: \"7ed1658e-98f8-4fe9-bb01-60b235015d4b\") " pod="openshift-multus/network-metrics-daemon-dkhdd" Apr 24 14:24:16.821599 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:16.821441 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vv982\" (UniqueName: \"kubernetes.io/projected/0a1eaa98-906e-4458-8492-83342d8bdd0f-kube-api-access-vv982\") pod \"network-check-target-jlk8v\" (UID: \"0a1eaa98-906e-4458-8492-83342d8bdd0f\") " pod="openshift-network-diagnostics/network-check-target-jlk8v" Apr 24 14:24:16.821599 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:16.821587 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:16.821599 ip-10-0-138-116 kubenswrapper[2571]: E0424 
14:24:16.821605 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:16.821920 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:16.821619 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vv982 for pod openshift-network-diagnostics/network-check-target-jlk8v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:16.821920 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:16.821683 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a1eaa98-906e-4458-8492-83342d8bdd0f-kube-api-access-vv982 podName:0a1eaa98-906e-4458-8492-83342d8bdd0f nodeName:}" failed. No retries permitted until 2026-04-24 14:24:18.821662473 +0000 UTC m=+5.152940217 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-vv982" (UniqueName: "kubernetes.io/projected/0a1eaa98-906e-4458-8492-83342d8bdd0f-kube-api-access-vv982") pod "network-check-target-jlk8v" (UID: "0a1eaa98-906e-4458-8492-83342d8bdd0f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:16.822200 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:16.822112 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:16.822200 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:16.822169 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs podName:7ed1658e-98f8-4fe9-bb01-60b235015d4b nodeName:}" failed. 
No retries permitted until 2026-04-24 14:24:18.822153572 +0000 UTC m=+5.153431320 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs") pod "network-metrics-daemon-dkhdd" (UID: "7ed1658e-98f8-4fe9-bb01-60b235015d4b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:17.065702 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:17.065667 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:24:17.156405 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:17.156271 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 14:19:15 +0000 UTC" deadline="2027-12-12 04:28:26.833228653 +0000 UTC" Apr 24 14:24:17.156405 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:17.156312 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14318h4m9.676922516s" Apr 24 14:24:17.225622 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:17.225119 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlk8v" Apr 24 14:24:17.225622 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:17.225251 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlk8v" podUID="0a1eaa98-906e-4458-8492-83342d8bdd0f" Apr 24 14:24:17.225622 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:17.225268 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkhdd" Apr 24 14:24:17.225622 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:17.225386 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkhdd" podUID="7ed1658e-98f8-4fe9-bb01-60b235015d4b" Apr 24 14:24:18.838739 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:18.838694 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vv982\" (UniqueName: \"kubernetes.io/projected/0a1eaa98-906e-4458-8492-83342d8bdd0f-kube-api-access-vv982\") pod \"network-check-target-jlk8v\" (UID: \"0a1eaa98-906e-4458-8492-83342d8bdd0f\") " pod="openshift-network-diagnostics/network-check-target-jlk8v" Apr 24 14:24:18.839253 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:18.838767 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs\") pod \"network-metrics-daemon-dkhdd\" (UID: \"7ed1658e-98f8-4fe9-bb01-60b235015d4b\") " pod="openshift-multus/network-metrics-daemon-dkhdd" Apr 24 14:24:18.839253 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:18.838910 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:18.839253 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:18.838972 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs podName:7ed1658e-98f8-4fe9-bb01-60b235015d4b nodeName:}" failed. 
No retries permitted until 2026-04-24 14:24:22.838953707 +0000 UTC m=+9.170231455 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs") pod "network-metrics-daemon-dkhdd" (UID: "7ed1658e-98f8-4fe9-bb01-60b235015d4b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:18.839434 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:18.839402 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:18.839434 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:18.839420 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:18.839434 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:18.839431 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vv982 for pod openshift-network-diagnostics/network-check-target-jlk8v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:18.839569 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:18.839485 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a1eaa98-906e-4458-8492-83342d8bdd0f-kube-api-access-vv982 podName:0a1eaa98-906e-4458-8492-83342d8bdd0f nodeName:}" failed. No retries permitted until 2026-04-24 14:24:22.839469347 +0000 UTC m=+9.170747096 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-vv982" (UniqueName: "kubernetes.io/projected/0a1eaa98-906e-4458-8492-83342d8bdd0f-kube-api-access-vv982") pod "network-check-target-jlk8v" (UID: "0a1eaa98-906e-4458-8492-83342d8bdd0f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:19.225614 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:19.225285 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkhdd" Apr 24 14:24:19.225614 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:19.225452 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkhdd" podUID="7ed1658e-98f8-4fe9-bb01-60b235015d4b" Apr 24 14:24:19.225821 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:19.225644 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlk8v" Apr 24 14:24:19.225821 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:19.225739 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlk8v" podUID="0a1eaa98-906e-4458-8492-83342d8bdd0f" Apr 24 14:24:21.225514 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:21.224651 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkhdd" Apr 24 14:24:21.225514 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:21.224867 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkhdd" podUID="7ed1658e-98f8-4fe9-bb01-60b235015d4b" Apr 24 14:24:21.225514 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:21.225367 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlk8v" Apr 24 14:24:21.225514 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:21.225454 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jlk8v" podUID="0a1eaa98-906e-4458-8492-83342d8bdd0f" Apr 24 14:24:22.874049 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:22.873506 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs\") pod \"network-metrics-daemon-dkhdd\" (UID: \"7ed1658e-98f8-4fe9-bb01-60b235015d4b\") " pod="openshift-multus/network-metrics-daemon-dkhdd" Apr 24 14:24:22.874049 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:22.873570 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vv982\" (UniqueName: \"kubernetes.io/projected/0a1eaa98-906e-4458-8492-83342d8bdd0f-kube-api-access-vv982\") pod \"network-check-target-jlk8v\" (UID: \"0a1eaa98-906e-4458-8492-83342d8bdd0f\") " pod="openshift-network-diagnostics/network-check-target-jlk8v" Apr 24 14:24:22.874049 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:22.873700 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:22.874049 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:22.873718 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:22.874049 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:22.873730 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vv982 for pod openshift-network-diagnostics/network-check-target-jlk8v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:22.874049 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:22.873812 2571 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:22.874049 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:22.873819 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a1eaa98-906e-4458-8492-83342d8bdd0f-kube-api-access-vv982 podName:0a1eaa98-906e-4458-8492-83342d8bdd0f nodeName:}" failed. No retries permitted until 2026-04-24 14:24:30.87376715 +0000 UTC m=+17.205044909 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-vv982" (UniqueName: "kubernetes.io/projected/0a1eaa98-906e-4458-8492-83342d8bdd0f-kube-api-access-vv982") pod "network-check-target-jlk8v" (UID: "0a1eaa98-906e-4458-8492-83342d8bdd0f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:22.874049 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:22.873891 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs podName:7ed1658e-98f8-4fe9-bb01-60b235015d4b nodeName:}" failed. No retries permitted until 2026-04-24 14:24:30.873872977 +0000 UTC m=+17.205150731 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs") pod "network-metrics-daemon-dkhdd" (UID: "7ed1658e-98f8-4fe9-bb01-60b235015d4b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:23.226042 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:23.225352 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkhdd" Apr 24 14:24:23.226042 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:23.225492 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkhdd" podUID="7ed1658e-98f8-4fe9-bb01-60b235015d4b" Apr 24 14:24:23.226042 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:23.225857 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlk8v" Apr 24 14:24:23.226042 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:23.225942 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlk8v" podUID="0a1eaa98-906e-4458-8492-83342d8bdd0f" Apr 24 14:24:25.225316 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:25.225281 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkhdd" Apr 24 14:24:25.225745 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:25.225279 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlk8v" Apr 24 14:24:25.225745 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:25.225404 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkhdd" podUID="7ed1658e-98f8-4fe9-bb01-60b235015d4b" Apr 24 14:24:25.225745 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:25.225456 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlk8v" podUID="0a1eaa98-906e-4458-8492-83342d8bdd0f" Apr 24 14:24:27.224637 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:27.224602 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlk8v" Apr 24 14:24:27.225082 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:27.224602 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkhdd" Apr 24 14:24:27.225082 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:27.224726 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jlk8v" podUID="0a1eaa98-906e-4458-8492-83342d8bdd0f" Apr 24 14:24:27.225082 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:27.224834 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkhdd" podUID="7ed1658e-98f8-4fe9-bb01-60b235015d4b" Apr 24 14:24:29.224882 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:29.224848 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkhdd" Apr 24 14:24:29.225363 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:29.224855 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlk8v" Apr 24 14:24:29.225363 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:29.224975 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkhdd" podUID="7ed1658e-98f8-4fe9-bb01-60b235015d4b" Apr 24 14:24:29.225363 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:29.225047 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jlk8v" podUID="0a1eaa98-906e-4458-8492-83342d8bdd0f" Apr 24 14:24:30.936967 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:30.936930 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vv982\" (UniqueName: \"kubernetes.io/projected/0a1eaa98-906e-4458-8492-83342d8bdd0f-kube-api-access-vv982\") pod \"network-check-target-jlk8v\" (UID: \"0a1eaa98-906e-4458-8492-83342d8bdd0f\") " pod="openshift-network-diagnostics/network-check-target-jlk8v" Apr 24 14:24:30.937452 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:30.937004 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs\") pod \"network-metrics-daemon-dkhdd\" (UID: \"7ed1658e-98f8-4fe9-bb01-60b235015d4b\") " pod="openshift-multus/network-metrics-daemon-dkhdd" Apr 24 14:24:30.937452 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:30.937085 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:30.937452 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:30.937109 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:30.937452 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:30.937123 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:30.937452 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:30.937137 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vv982 for pod openshift-network-diagnostics/network-check-target-jlk8v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:30.937452 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:30.937179 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs podName:7ed1658e-98f8-4fe9-bb01-60b235015d4b nodeName:}" failed. No retries permitted until 2026-04-24 14:24:46.937157269 +0000 UTC m=+33.268435013 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs") pod "network-metrics-daemon-dkhdd" (UID: "7ed1658e-98f8-4fe9-bb01-60b235015d4b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:30.937452 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:30.937199 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a1eaa98-906e-4458-8492-83342d8bdd0f-kube-api-access-vv982 podName:0a1eaa98-906e-4458-8492-83342d8bdd0f nodeName:}" failed. No retries permitted until 2026-04-24 14:24:46.937190051 +0000 UTC m=+33.268467793 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-vv982" (UniqueName: "kubernetes.io/projected/0a1eaa98-906e-4458-8492-83342d8bdd0f-kube-api-access-vv982") pod "network-check-target-jlk8v" (UID: "0a1eaa98-906e-4458-8492-83342d8bdd0f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:31.225332 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:31.225270 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkhdd" Apr 24 14:24:31.225451 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:31.225275 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlk8v" Apr 24 14:24:31.225451 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:31.225380 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkhdd" podUID="7ed1658e-98f8-4fe9-bb01-60b235015d4b" Apr 24 14:24:31.225451 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:31.225433 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlk8v" podUID="0a1eaa98-906e-4458-8492-83342d8bdd0f" Apr 24 14:24:33.224676 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:33.224653 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkhdd" Apr 24 14:24:33.224921 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:33.224654 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlk8v" Apr 24 14:24:33.224921 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:33.224755 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dkhdd" podUID="7ed1658e-98f8-4fe9-bb01-60b235015d4b" Apr 24 14:24:33.224921 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:33.224826 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlk8v" podUID="0a1eaa98-906e-4458-8492-83342d8bdd0f" Apr 24 14:24:33.467375 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:33.467193 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jxm64" event={"ID":"75370bd4-7795-4ebf-8a12-27eda2d9b1d7","Type":"ContainerStarted","Data":"96980032acebb4a7f4b6436de88e0d09c61695da340559817409eda46221f20a"} Apr 24 14:24:33.469268 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:33.469235 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" event={"ID":"3307a337-f7bb-48ac-bb80-128ee9a46983","Type":"ContainerStarted","Data":"563ab9b4f86e7db11b96bd890c4f6d3e54ba9c530cd9d45ed8806df69f149cf6"} Apr 24 14:24:33.471258 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:33.471157 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-116.ec2.internal" event={"ID":"63ded8f515ead39ec1575ef940918d4c","Type":"ContainerStarted","Data":"fc2835e403ca52e8eed5f6ae358ee2c3fb6038a3a9b3ee7327984e20334e0612"} Apr 24 14:24:33.473519 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:33.473499 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log" Apr 24 14:24:33.474128 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:33.474079 2571 generic.go:358] "Generic (PLEG): container 
finished" podID="151dbb1d-0d3a-4890-8076-f774d13b7e70" containerID="bb959d61b83801e64a1684073146898e95f53149092d449972b4da6685d32e04" exitCode=1 Apr 24 14:24:33.474207 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:33.474141 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" event={"ID":"151dbb1d-0d3a-4890-8076-f774d13b7e70","Type":"ContainerStarted","Data":"55d70ded769340cf06e6c71614ceee79ef5cff7c62a8ef6a6bcc5a7f30d49919"} Apr 24 14:24:33.474207 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:33.474164 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" event={"ID":"151dbb1d-0d3a-4890-8076-f774d13b7e70","Type":"ContainerDied","Data":"bb959d61b83801e64a1684073146898e95f53149092d449972b4da6685d32e04"} Apr 24 14:24:33.474207 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:33.474180 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" event={"ID":"151dbb1d-0d3a-4890-8076-f774d13b7e70","Type":"ContainerStarted","Data":"499b57ae2bd7a47340a1f67d861989fa6fba21d22e1a0a39287f898f5a579521"} Apr 24 14:24:33.490162 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:33.490079 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jxm64" podStartSLOduration=1.666036112 podStartE2EDuration="19.490059786s" podCreationTimestamp="2026-04-24 14:24:14 +0000 UTC" firstStartedPulling="2026-04-24 14:24:15.492447497 +0000 UTC m=+1.823725254" lastFinishedPulling="2026-04-24 14:24:33.316471186 +0000 UTC m=+19.647748928" observedRunningTime="2026-04-24 14:24:33.489256436 +0000 UTC m=+19.820534202" watchObservedRunningTime="2026-04-24 14:24:33.490059786 +0000 UTC m=+19.821337552" Apr 24 14:24:33.506455 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:33.506263 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-xgvkw" 
podStartSLOduration=1.981759045 podStartE2EDuration="19.506245093s" podCreationTimestamp="2026-04-24 14:24:14 +0000 UTC" firstStartedPulling="2026-04-24 14:24:15.456093513 +0000 UTC m=+1.787371256" lastFinishedPulling="2026-04-24 14:24:32.980579554 +0000 UTC m=+19.311857304" observedRunningTime="2026-04-24 14:24:33.505869302 +0000 UTC m=+19.837147089" watchObservedRunningTime="2026-04-24 14:24:33.506245093 +0000 UTC m=+19.837522862" Apr 24 14:24:34.477328 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:34.477173 2571 generic.go:358] "Generic (PLEG): container finished" podID="aace953f-49d0-4c47-9522-d00bf8dece62" containerID="7de7878cec0753cbda1d96b40198b5e0bced23fed701c7fc343c6be60232180b" exitCode=0 Apr 24 14:24:34.478281 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:34.477254 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m79vj" event={"ID":"aace953f-49d0-4c47-9522-d00bf8dece62","Type":"ContainerDied","Data":"7de7878cec0753cbda1d96b40198b5e0bced23fed701c7fc343c6be60232180b"} Apr 24 14:24:34.478976 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:34.478955 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tlvj4" event={"ID":"d2eb35e3-b76b-433c-b90d-a4481a2cd709","Type":"ContainerStarted","Data":"aaaa9e994b49a5617655965a95ddfd753589958195271a9cc7aa30c3d678f4e4"} Apr 24 14:24:34.480474 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:34.480448 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc" event={"ID":"0d48f14b-bdc0-4862-a203-5b9e2cb1299b","Type":"ContainerStarted","Data":"fef84a26d91b9e78e32c0b7f3bdeb384c3773abd6b2b97365e59e4403c373782"} Apr 24 14:24:34.481738 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:34.481714 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-f25sj" 
event={"ID":"1e92bec3-9630-4928-b6e4-dcf3fbc8dd82","Type":"ContainerStarted","Data":"2bfb913ef682ed5c1b44370bed583109c810de527a193128cb130551a6f060e2"} Apr 24 14:24:34.483059 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:34.483041 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2rs7b" event={"ID":"2ded180c-8601-4aaf-86bd-6a13b101faa8","Type":"ContainerStarted","Data":"933d14015fbe6fc17971f47f44fe7a9d97ab9e3bd87ce2056479ce1ca9370423"} Apr 24 14:24:34.484630 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:34.484608 2571 generic.go:358] "Generic (PLEG): container finished" podID="67ed165269d2376e0c4ebf616952f5d1" containerID="281fce0ae2bade1ac57494c3d57f61d4fd1be7745aaae981e48f9242ac5258cd" exitCode=0 Apr 24 14:24:34.484715 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:34.484675 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-116.ec2.internal" event={"ID":"67ed165269d2376e0c4ebf616952f5d1","Type":"ContainerDied","Data":"281fce0ae2bade1ac57494c3d57f61d4fd1be7745aaae981e48f9242ac5258cd"} Apr 24 14:24:34.487647 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:34.487367 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log" Apr 24 14:24:34.489180 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:34.489158 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" event={"ID":"151dbb1d-0d3a-4890-8076-f774d13b7e70","Type":"ContainerStarted","Data":"bcce019109afbf9cba1c0cb2c36c7c0f49df7f8acdd221fdfa2d6863a9c2d85e"} Apr 24 14:24:34.489180 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:34.489184 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" 
event={"ID":"151dbb1d-0d3a-4890-8076-f774d13b7e70","Type":"ContainerStarted","Data":"68ad5705471174c9e2a6a19fe428bcccadc40b0cf2413d2a4f4bcf9dd6218812"}
Apr 24 14:24:34.489180 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:34.489193 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" event={"ID":"151dbb1d-0d3a-4890-8076-f774d13b7e70","Type":"ContainerStarted","Data":"d8aa0aa17c5aa9e2f9ddabbd0472fcf55cb7448ced02a3b49740dff278373b26"}
Apr 24 14:24:34.499851 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:34.499812 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-116.ec2.internal" podStartSLOduration=19.499797537 podStartE2EDuration="19.499797537s" podCreationTimestamp="2026-04-24 14:24:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:24:33.520627327 +0000 UTC m=+19.851905091" watchObservedRunningTime="2026-04-24 14:24:34.499797537 +0000 UTC m=+20.831075302"
Apr 24 14:24:34.530261 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:34.530221 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2rs7b" podStartSLOduration=2.946654421 podStartE2EDuration="20.530208659s" podCreationTimestamp="2026-04-24 14:24:14 +0000 UTC" firstStartedPulling="2026-04-24 14:24:15.460285568 +0000 UTC m=+1.791563310" lastFinishedPulling="2026-04-24 14:24:33.043839793 +0000 UTC m=+19.375117548" observedRunningTime="2026-04-24 14:24:34.529958734 +0000 UTC m=+20.861236498" watchObservedRunningTime="2026-04-24 14:24:34.530208659 +0000 UTC m=+20.861486424"
Apr 24 14:24:34.544174 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:34.544126 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-f25sj" podStartSLOduration=3.032160864 podStartE2EDuration="20.544094809s" podCreationTimestamp="2026-04-24 14:24:14 +0000 UTC" firstStartedPulling="2026-04-24 14:24:15.527496751 +0000 UTC m=+1.858774497" lastFinishedPulling="2026-04-24 14:24:33.039430684 +0000 UTC m=+19.370708442" observedRunningTime="2026-04-24 14:24:34.543026137 +0000 UTC m=+20.874303893" watchObservedRunningTime="2026-04-24 14:24:34.544094809 +0000 UTC m=+20.875372565"
Apr 24 14:24:34.557523 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:34.557486 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tlvj4" podStartSLOduration=3.071903707 podStartE2EDuration="20.557476177s" podCreationTimestamp="2026-04-24 14:24:14 +0000 UTC" firstStartedPulling="2026-04-24 14:24:15.552144669 +0000 UTC m=+1.883422415" lastFinishedPulling="2026-04-24 14:24:33.037717142 +0000 UTC m=+19.368994885" observedRunningTime="2026-04-24 14:24:34.557199036 +0000 UTC m=+20.888476798" watchObservedRunningTime="2026-04-24 14:24:34.557476177 +0000 UTC m=+20.888753941"
Apr 24 14:24:34.781972 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:34.781948 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 14:24:35.165204 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:35.165080 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T14:24:34.781962871Z","UUID":"3b4cb1ae-dc31-4dde-88a3-f62b2fa48b75","Handler":null,"Name":"","Endpoint":""}
Apr 24 14:24:35.167075 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:35.167051 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 14:24:35.167075 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:35.167079 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 14:24:35.225004 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:35.224970 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkhdd"
Apr 24 14:24:35.225165 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:35.225020 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlk8v"
Apr 24 14:24:35.225165 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:35.225078 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkhdd" podUID="7ed1658e-98f8-4fe9-bb01-60b235015d4b"
Apr 24 14:24:35.225261 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:35.225211 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlk8v" podUID="0a1eaa98-906e-4458-8492-83342d8bdd0f"
Apr 24 14:24:35.493741 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:35.493654 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-116.ec2.internal" event={"ID":"67ed165269d2376e0c4ebf616952f5d1","Type":"ContainerStarted","Data":"a8ad36a6ad6f8aa305e08d964eee1ca67f9fc72d50b935da5ca166b69bbecd9e"}
Apr 24 14:24:35.495697 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:35.495669 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc" event={"ID":"0d48f14b-bdc0-4862-a203-5b9e2cb1299b","Type":"ContainerStarted","Data":"6543820e9fd3a44821b46e6497f421ebbfb5bd55c656c41f88bd747d14065f59"}
Apr 24 14:24:35.497256 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:35.497205 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hwhv7" event={"ID":"7b650bb0-88a4-4b81-a9d8-a1f2b16a8c46","Type":"ContainerStarted","Data":"7f956e4b47f5bb43cb81d1f04ef1bf679a3e321f7eea89af4071c607f100a134"}
Apr 24 14:24:35.508556 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:35.508455 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-116.ec2.internal" podStartSLOduration=21.508443915 podStartE2EDuration="21.508443915s" podCreationTimestamp="2026-04-24 14:24:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:24:35.508266752 +0000 UTC m=+21.839544516" watchObservedRunningTime="2026-04-24 14:24:35.508443915 +0000 UTC m=+21.839721677"
Apr 24 14:24:35.522323 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:35.522274 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-hwhv7" podStartSLOduration=3.921050748 podStartE2EDuration="21.522258033s" podCreationTimestamp="2026-04-24 14:24:14 +0000 UTC" firstStartedPulling="2026-04-24 14:24:15.436724174 +0000 UTC m=+1.768001921" lastFinishedPulling="2026-04-24 14:24:33.03793145 +0000 UTC m=+19.369209206" observedRunningTime="2026-04-24 14:24:35.521720376 +0000 UTC m=+21.852998140" watchObservedRunningTime="2026-04-24 14:24:35.522258033 +0000 UTC m=+21.853535798"
Apr 24 14:24:36.081187 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:36.080856 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-f25sj"
Apr 24 14:24:36.081482 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:36.081466 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-f25sj"
Apr 24 14:24:36.501721 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:36.501631 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc" event={"ID":"0d48f14b-bdc0-4862-a203-5b9e2cb1299b","Type":"ContainerStarted","Data":"7cc2021732eee803eee5ac0afd75902265f3a7d3a238e74415f3a3275918467d"}
Apr 24 14:24:36.504770 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:36.504746 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log"
Apr 24 14:24:36.505170 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:36.505143 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" event={"ID":"151dbb1d-0d3a-4890-8076-f774d13b7e70","Type":"ContainerStarted","Data":"bebd642a1670df527a0c65c001b10a9c11a966abd6a053056a29cfb7f00a7695"}
Apr 24 14:24:36.505665 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:36.505644 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-f25sj"
Apr 24 14:24:36.506156 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:36.506138 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-f25sj"
Apr 24 14:24:36.536009 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:36.535970 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rsqtc" podStartSLOduration=2.51349091 podStartE2EDuration="22.535958188s" podCreationTimestamp="2026-04-24 14:24:14 +0000 UTC" firstStartedPulling="2026-04-24 14:24:15.547224291 +0000 UTC m=+1.878502033" lastFinishedPulling="2026-04-24 14:24:35.569691557 +0000 UTC m=+21.900969311" observedRunningTime="2026-04-24 14:24:36.535482797 +0000 UTC m=+22.866760573" watchObservedRunningTime="2026-04-24 14:24:36.535958188 +0000 UTC m=+22.867235951"
Apr 24 14:24:37.225555 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:37.225520 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlk8v"
Apr 24 14:24:37.225742 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:37.225520 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkhdd"
Apr 24 14:24:37.225742 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:37.225641 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlk8v" podUID="0a1eaa98-906e-4458-8492-83342d8bdd0f"
Apr 24 14:24:37.225856 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:37.225735 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkhdd" podUID="7ed1658e-98f8-4fe9-bb01-60b235015d4b"
Apr 24 14:24:38.510584 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:38.510370 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m79vj" event={"ID":"aace953f-49d0-4c47-9522-d00bf8dece62","Type":"ContainerStarted","Data":"e0e6187d6cca89d97faf2389fe1ca8d7810d765e4ccf874524e96959f3056a59"}
Apr 24 14:24:38.513321 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:38.513300 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log"
Apr 24 14:24:38.513682 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:38.513662 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" event={"ID":"151dbb1d-0d3a-4890-8076-f774d13b7e70","Type":"ContainerStarted","Data":"d9c5024872fcd65da3bb01e262b6ecdb158ffa260baf3109dfd93cb3eb424dc7"}
Apr 24 14:24:38.514065 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:38.514046 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:38.514182 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:38.514078 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:38.514182 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:38.514155 2571 scope.go:117] "RemoveContainer" containerID="bb959d61b83801e64a1684073146898e95f53149092d449972b4da6685d32e04"
Apr 24 14:24:38.530273 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:38.529600 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:39.225437 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:39.225406 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkhdd"
Apr 24 14:24:39.225575 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:39.225406 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlk8v"
Apr 24 14:24:39.225575 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:39.225507 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkhdd" podUID="7ed1658e-98f8-4fe9-bb01-60b235015d4b"
Apr 24 14:24:39.225575 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:39.225562 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlk8v" podUID="0a1eaa98-906e-4458-8492-83342d8bdd0f"
Apr 24 14:24:39.517956 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:39.517919 2571 generic.go:358] "Generic (PLEG): container finished" podID="aace953f-49d0-4c47-9522-d00bf8dece62" containerID="e0e6187d6cca89d97faf2389fe1ca8d7810d765e4ccf874524e96959f3056a59" exitCode=0
Apr 24 14:24:39.518417 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:39.518000 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m79vj" event={"ID":"aace953f-49d0-4c47-9522-d00bf8dece62","Type":"ContainerDied","Data":"e0e6187d6cca89d97faf2389fe1ca8d7810d765e4ccf874524e96959f3056a59"}
Apr 24 14:24:39.523624 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:39.523605 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log"
Apr 24 14:24:39.524158 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:39.524030 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" event={"ID":"151dbb1d-0d3a-4890-8076-f774d13b7e70","Type":"ContainerStarted","Data":"af62456136a03c3603085b42b31f45fa301ee9c2c6b74fb35cc0a0b6bb106bae"}
Apr 24 14:24:39.524529 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:39.524454 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:39.541688 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:39.541665 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw"
Apr 24 14:24:39.594688 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:39.594043 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" podStartSLOduration=8.030486304 podStartE2EDuration="25.594024883s" podCreationTimestamp="2026-04-24 14:24:14 +0000 UTC" firstStartedPulling="2026-04-24 14:24:15.497756842 +0000 UTC m=+1.829034584" lastFinishedPulling="2026-04-24 14:24:33.061295416 +0000 UTC m=+19.392573163" observedRunningTime="2026-04-24 14:24:39.593325744 +0000 UTC m=+25.924603508" watchObservedRunningTime="2026-04-24 14:24:39.594024883 +0000 UTC m=+25.925302647"
Apr 24 14:24:40.279040 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:40.278806 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dkhdd"]
Apr 24 14:24:40.279184 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:40.279169 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkhdd"
Apr 24 14:24:40.279303 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:40.279279 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkhdd" podUID="7ed1658e-98f8-4fe9-bb01-60b235015d4b"
Apr 24 14:24:40.281352 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:40.281331 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jlk8v"]
Apr 24 14:24:40.281450 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:40.281417 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlk8v"
Apr 24 14:24:40.281506 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:40.281480 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlk8v" podUID="0a1eaa98-906e-4458-8492-83342d8bdd0f"
Apr 24 14:24:40.527666 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:40.527635 2571 generic.go:358] "Generic (PLEG): container finished" podID="aace953f-49d0-4c47-9522-d00bf8dece62" containerID="93fec69af6b99d28e855a7126f22128cec01d25634f9ed4a97bbd29f4458c716" exitCode=0
Apr 24 14:24:40.528141 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:40.527721 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m79vj" event={"ID":"aace953f-49d0-4c47-9522-d00bf8dece62","Type":"ContainerDied","Data":"93fec69af6b99d28e855a7126f22128cec01d25634f9ed4a97bbd29f4458c716"}
Apr 24 14:24:41.531249 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:41.531217 2571 generic.go:358] "Generic (PLEG): container finished" podID="aace953f-49d0-4c47-9522-d00bf8dece62" containerID="961713f6a8f457ba7a6e005e3b888b6d022193408df8f0d96cf4aac800b17910" exitCode=0
Apr 24 14:24:41.531599 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:41.531261 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m79vj" event={"ID":"aace953f-49d0-4c47-9522-d00bf8dece62","Type":"ContainerDied","Data":"961713f6a8f457ba7a6e005e3b888b6d022193408df8f0d96cf4aac800b17910"}
Apr 24 14:24:42.224688 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:42.224657 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlk8v"
Apr 24 14:24:42.224814 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:42.224791 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlk8v" podUID="0a1eaa98-906e-4458-8492-83342d8bdd0f"
Apr 24 14:24:42.224866 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:42.224846 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkhdd"
Apr 24 14:24:42.224964 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:42.224941 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkhdd" podUID="7ed1658e-98f8-4fe9-bb01-60b235015d4b"
Apr 24 14:24:44.226321 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:44.226290 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlk8v"
Apr 24 14:24:44.227052 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:44.226383 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlk8v" podUID="0a1eaa98-906e-4458-8492-83342d8bdd0f"
Apr 24 14:24:44.227052 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:44.226466 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkhdd"
Apr 24 14:24:44.227052 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:44.226561 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dkhdd" podUID="7ed1658e-98f8-4fe9-bb01-60b235015d4b"
Apr 24 14:24:46.012850 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.012822 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-116.ec2.internal" event="NodeReady"
Apr 24 14:24:46.013402 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.012970 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 14:24:46.059839 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.059815 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-z29nv"]
Apr 24 14:24:46.088265 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.088225 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5gpks"]
Apr 24 14:24:46.088426 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.088403 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-z29nv"
Apr 24 14:24:46.090752 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.090721 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 14:24:46.090752 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.090741 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 14:24:46.090889 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.090722 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6p8rb\""
Apr 24 14:24:46.104042 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.103618 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z29nv"]
Apr 24 14:24:46.104042 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.103646 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5gpks"]
Apr 24 14:24:46.104042 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.103742 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5gpks"
Apr 24 14:24:46.106036 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.106018 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 14:24:46.106364 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.106344 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 14:24:46.106502 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.106484 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9v25p\""
Apr 24 14:24:46.106591 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.106571 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 14:24:46.224691 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.224642 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlk8v"
Apr 24 14:24:46.224691 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.224676 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkhdd"
Apr 24 14:24:46.227520 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.227498 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4q5p5\""
Apr 24 14:24:46.229914 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.229896 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 14:24:46.229914 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.229910 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 14:24:46.230135 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.229928 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nctch\""
Apr 24 14:24:46.230135 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.229987 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 14:24:46.250196 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.250175 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5jvl\" (UniqueName: \"kubernetes.io/projected/92dbdc96-9b06-45f2-9e4e-317abc345922-kube-api-access-p5jvl\") pod \"dns-default-z29nv\" (UID: \"92dbdc96-9b06-45f2-9e4e-317abc345922\") " pod="openshift-dns/dns-default-z29nv"
Apr 24 14:24:46.250311 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.250231 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdfv2\" (UniqueName: \"kubernetes.io/projected/aa089429-959e-4b12-bddb-1d6d0ce963c9-kube-api-access-tdfv2\") pod \"ingress-canary-5gpks\" (UID: \"aa089429-959e-4b12-bddb-1d6d0ce963c9\") " pod="openshift-ingress-canary/ingress-canary-5gpks"
Apr 24 14:24:46.250311 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.250294 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert\") pod \"ingress-canary-5gpks\" (UID: \"aa089429-959e-4b12-bddb-1d6d0ce963c9\") " pod="openshift-ingress-canary/ingress-canary-5gpks"
Apr 24 14:24:46.250402 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.250353 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92dbdc96-9b06-45f2-9e4e-317abc345922-config-volume\") pod \"dns-default-z29nv\" (UID: \"92dbdc96-9b06-45f2-9e4e-317abc345922\") " pod="openshift-dns/dns-default-z29nv"
Apr 24 14:24:46.250402 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.250382 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92dbdc96-9b06-45f2-9e4e-317abc345922-tmp-dir\") pod \"dns-default-z29nv\" (UID: \"92dbdc96-9b06-45f2-9e4e-317abc345922\") " pod="openshift-dns/dns-default-z29nv"
Apr 24 14:24:46.250505 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.250419 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls\") pod \"dns-default-z29nv\" (UID: \"92dbdc96-9b06-45f2-9e4e-317abc345922\") " pod="openshift-dns/dns-default-z29nv"
Apr 24 14:24:46.351640 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.351555 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls\") pod \"dns-default-z29nv\" (UID: \"92dbdc96-9b06-45f2-9e4e-317abc345922\") " pod="openshift-dns/dns-default-z29nv"
Apr 24 14:24:46.351640 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.351601 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5jvl\" (UniqueName: \"kubernetes.io/projected/92dbdc96-9b06-45f2-9e4e-317abc345922-kube-api-access-p5jvl\") pod \"dns-default-z29nv\" (UID: \"92dbdc96-9b06-45f2-9e4e-317abc345922\") " pod="openshift-dns/dns-default-z29nv"
Apr 24 14:24:46.351857 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.351651 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdfv2\" (UniqueName: \"kubernetes.io/projected/aa089429-959e-4b12-bddb-1d6d0ce963c9-kube-api-access-tdfv2\") pod \"ingress-canary-5gpks\" (UID: \"aa089429-959e-4b12-bddb-1d6d0ce963c9\") " pod="openshift-ingress-canary/ingress-canary-5gpks"
Apr 24 14:24:46.351857 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.351683 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert\") pod \"ingress-canary-5gpks\" (UID: \"aa089429-959e-4b12-bddb-1d6d0ce963c9\") " pod="openshift-ingress-canary/ingress-canary-5gpks"
Apr 24 14:24:46.351857 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:46.351725 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 14:24:46.351857 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:46.351769 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 14:24:46.351857 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:46.351797 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls podName:92dbdc96-9b06-45f2-9e4e-317abc345922 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:46.851776772 +0000 UTC m=+33.183054514 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls") pod "dns-default-z29nv" (UID: "92dbdc96-9b06-45f2-9e4e-317abc345922") : secret "dns-default-metrics-tls" not found
Apr 24 14:24:46.351857 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:46.351815 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert podName:aa089429-959e-4b12-bddb-1d6d0ce963c9 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:46.851806246 +0000 UTC m=+33.183083989 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert") pod "ingress-canary-5gpks" (UID: "aa089429-959e-4b12-bddb-1d6d0ce963c9") : secret "canary-serving-cert" not found
Apr 24 14:24:46.352169 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.351883 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92dbdc96-9b06-45f2-9e4e-317abc345922-config-volume\") pod \"dns-default-z29nv\" (UID: \"92dbdc96-9b06-45f2-9e4e-317abc345922\") " pod="openshift-dns/dns-default-z29nv"
Apr 24 14:24:46.352169 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.351929 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92dbdc96-9b06-45f2-9e4e-317abc345922-tmp-dir\") pod \"dns-default-z29nv\" (UID: \"92dbdc96-9b06-45f2-9e4e-317abc345922\") " pod="openshift-dns/dns-default-z29nv"
Apr 24 14:24:46.352297 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.352270 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92dbdc96-9b06-45f2-9e4e-317abc345922-tmp-dir\") pod \"dns-default-z29nv\" (UID: \"92dbdc96-9b06-45f2-9e4e-317abc345922\") " pod="openshift-dns/dns-default-z29nv"
Apr 24 14:24:46.352573 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.352552 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92dbdc96-9b06-45f2-9e4e-317abc345922-config-volume\") pod \"dns-default-z29nv\" (UID: \"92dbdc96-9b06-45f2-9e4e-317abc345922\") " pod="openshift-dns/dns-default-z29nv"
Apr 24 14:24:46.362125 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.362086 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5jvl\" (UniqueName: \"kubernetes.io/projected/92dbdc96-9b06-45f2-9e4e-317abc345922-kube-api-access-p5jvl\") pod \"dns-default-z29nv\" (UID: \"92dbdc96-9b06-45f2-9e4e-317abc345922\") " pod="openshift-dns/dns-default-z29nv"
Apr 24 14:24:46.362381 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.362359 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdfv2\" (UniqueName: \"kubernetes.io/projected/aa089429-959e-4b12-bddb-1d6d0ce963c9-kube-api-access-tdfv2\") pod \"ingress-canary-5gpks\" (UID: \"aa089429-959e-4b12-bddb-1d6d0ce963c9\") " pod="openshift-ingress-canary/ingress-canary-5gpks"
Apr 24 14:24:46.855900 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.855721 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert\") pod \"ingress-canary-5gpks\" (UID: \"aa089429-959e-4b12-bddb-1d6d0ce963c9\") " pod="openshift-ingress-canary/ingress-canary-5gpks"
Apr 24 14:24:46.856125 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.855950 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls\") pod \"dns-default-z29nv\" (UID: \"92dbdc96-9b06-45f2-9e4e-317abc345922\") " pod="openshift-dns/dns-default-z29nv"
Apr 24 14:24:46.856125 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:46.855866 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 14:24:46.856125 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:46.856017 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert podName:aa089429-959e-4b12-bddb-1d6d0ce963c9 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:47.856000336 +0000 UTC m=+34.187278078 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert") pod "ingress-canary-5gpks" (UID: "aa089429-959e-4b12-bddb-1d6d0ce963c9") : secret "canary-serving-cert" not found
Apr 24 14:24:46.856125 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:46.856048 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 14:24:46.856306 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:46.856142 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls podName:92dbdc96-9b06-45f2-9e4e-317abc345922 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:47.856086056 +0000 UTC m=+34.187363813 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls") pod "dns-default-z29nv" (UID: "92dbdc96-9b06-45f2-9e4e-317abc345922") : secret "dns-default-metrics-tls" not found Apr 24 14:24:46.956532 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.956503 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs\") pod \"network-metrics-daemon-dkhdd\" (UID: \"7ed1658e-98f8-4fe9-bb01-60b235015d4b\") " pod="openshift-multus/network-metrics-daemon-dkhdd" Apr 24 14:24:46.956694 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.956557 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vv982\" (UniqueName: \"kubernetes.io/projected/0a1eaa98-906e-4458-8492-83342d8bdd0f-kube-api-access-vv982\") pod \"network-check-target-jlk8v\" (UID: \"0a1eaa98-906e-4458-8492-83342d8bdd0f\") " pod="openshift-network-diagnostics/network-check-target-jlk8v" Apr 24 14:24:46.956694 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:46.956661 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 14:24:46.956763 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:46.956723 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs podName:7ed1658e-98f8-4fe9-bb01-60b235015d4b nodeName:}" failed. No retries permitted until 2026-04-24 14:25:18.956704606 +0000 UTC m=+65.287982349 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs") pod "network-metrics-daemon-dkhdd" (UID: "7ed1658e-98f8-4fe9-bb01-60b235015d4b") : secret "metrics-daemon-secret" not found Apr 24 14:24:46.959169 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:46.959143 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv982\" (UniqueName: \"kubernetes.io/projected/0a1eaa98-906e-4458-8492-83342d8bdd0f-kube-api-access-vv982\") pod \"network-check-target-jlk8v\" (UID: \"0a1eaa98-906e-4458-8492-83342d8bdd0f\") " pod="openshift-network-diagnostics/network-check-target-jlk8v" Apr 24 14:24:47.136027 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:47.135958 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlk8v" Apr 24 14:24:47.318660 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:47.318620 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jlk8v"] Apr 24 14:24:47.322495 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:24:47.322465 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a1eaa98_906e_4458_8492_83342d8bdd0f.slice/crio-797b255888ec3a71adefeed9f0b3dae44f4f8aae881fd11d94a4d69da6f67dde WatchSource:0}: Error finding container 797b255888ec3a71adefeed9f0b3dae44f4f8aae881fd11d94a4d69da6f67dde: Status 404 returned error can't find the container with id 797b255888ec3a71adefeed9f0b3dae44f4f8aae881fd11d94a4d69da6f67dde Apr 24 14:24:47.545894 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:47.545858 2571 generic.go:358] "Generic (PLEG): container finished" podID="aace953f-49d0-4c47-9522-d00bf8dece62" containerID="1205ea1ff29ca7975ead60c4b6c09e99ccf8b8ccb0ef319543c3b2e62aeabc08" exitCode=0 Apr 24 14:24:47.546061 ip-10-0-138-116 
kubenswrapper[2571]: I0424 14:24:47.545938 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m79vj" event={"ID":"aace953f-49d0-4c47-9522-d00bf8dece62","Type":"ContainerDied","Data":"1205ea1ff29ca7975ead60c4b6c09e99ccf8b8ccb0ef319543c3b2e62aeabc08"} Apr 24 14:24:47.546988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:47.546966 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jlk8v" event={"ID":"0a1eaa98-906e-4458-8492-83342d8bdd0f","Type":"ContainerStarted","Data":"797b255888ec3a71adefeed9f0b3dae44f4f8aae881fd11d94a4d69da6f67dde"} Apr 24 14:24:47.863749 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:47.863716 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert\") pod \"ingress-canary-5gpks\" (UID: \"aa089429-959e-4b12-bddb-1d6d0ce963c9\") " pod="openshift-ingress-canary/ingress-canary-5gpks" Apr 24 14:24:47.863931 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:47.863777 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls\") pod \"dns-default-z29nv\" (UID: \"92dbdc96-9b06-45f2-9e4e-317abc345922\") " pod="openshift-dns/dns-default-z29nv" Apr 24 14:24:47.863931 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:47.863870 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:47.864037 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:47.863934 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert podName:aa089429-959e-4b12-bddb-1d6d0ce963c9 nodeName:}" failed. 
No retries permitted until 2026-04-24 14:24:49.863916006 +0000 UTC m=+36.195193767 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert") pod "ingress-canary-5gpks" (UID: "aa089429-959e-4b12-bddb-1d6d0ce963c9") : secret "canary-serving-cert" not found Apr 24 14:24:47.864037 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:47.863876 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:47.864037 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:47.864019 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls podName:92dbdc96-9b06-45f2-9e4e-317abc345922 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:49.864002486 +0000 UTC m=+36.195280245 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls") pod "dns-default-z29nv" (UID: "92dbdc96-9b06-45f2-9e4e-317abc345922") : secret "dns-default-metrics-tls" not found Apr 24 14:24:48.551760 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:48.551718 2571 generic.go:358] "Generic (PLEG): container finished" podID="aace953f-49d0-4c47-9522-d00bf8dece62" containerID="ec1e493485b55bd49549969f93a5d5d6d1cd300123f5839b1bcf345b5a1b23d0" exitCode=0 Apr 24 14:24:48.552272 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:48.551784 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m79vj" event={"ID":"aace953f-49d0-4c47-9522-d00bf8dece62","Type":"ContainerDied","Data":"ec1e493485b55bd49549969f93a5d5d6d1cd300123f5839b1bcf345b5a1b23d0"} Apr 24 14:24:49.557368 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:49.557177 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-m79vj" event={"ID":"aace953f-49d0-4c47-9522-d00bf8dece62","Type":"ContainerStarted","Data":"8ad7608cfe44392b7fd88037c34e1cfab5d857ae4fb4d594894c762fc4ccc1d9"} Apr 24 14:24:49.589814 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:49.589763 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-m79vj" podStartSLOduration=3.955542138 podStartE2EDuration="35.589745072s" podCreationTimestamp="2026-04-24 14:24:14 +0000 UTC" firstStartedPulling="2026-04-24 14:24:15.556446616 +0000 UTC m=+1.887724359" lastFinishedPulling="2026-04-24 14:24:47.190649547 +0000 UTC m=+33.521927293" observedRunningTime="2026-04-24 14:24:49.588979674 +0000 UTC m=+35.920257439" watchObservedRunningTime="2026-04-24 14:24:49.589745072 +0000 UTC m=+35.921022838" Apr 24 14:24:49.877425 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:49.877345 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert\") pod \"ingress-canary-5gpks\" (UID: \"aa089429-959e-4b12-bddb-1d6d0ce963c9\") " pod="openshift-ingress-canary/ingress-canary-5gpks" Apr 24 14:24:49.877425 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:49.877418 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls\") pod \"dns-default-z29nv\" (UID: \"92dbdc96-9b06-45f2-9e4e-317abc345922\") " pod="openshift-dns/dns-default-z29nv" Apr 24 14:24:49.877627 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:49.877500 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:49.877627 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:49.877517 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: 
secret "dns-default-metrics-tls" not found Apr 24 14:24:49.877627 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:49.877569 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert podName:aa089429-959e-4b12-bddb-1d6d0ce963c9 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:53.877554043 +0000 UTC m=+40.208831785 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert") pod "ingress-canary-5gpks" (UID: "aa089429-959e-4b12-bddb-1d6d0ce963c9") : secret "canary-serving-cert" not found Apr 24 14:24:49.877627 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:49.877583 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls podName:92dbdc96-9b06-45f2-9e4e-317abc345922 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:53.877577382 +0000 UTC m=+40.208855136 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls") pod "dns-default-z29nv" (UID: "92dbdc96-9b06-45f2-9e4e-317abc345922") : secret "dns-default-metrics-tls" not found Apr 24 14:24:50.560209 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:50.560131 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jlk8v" event={"ID":"0a1eaa98-906e-4458-8492-83342d8bdd0f","Type":"ContainerStarted","Data":"2007c4d22c20735bad9ad599764a8017bce728fb833bed4649bf12967fa88a6c"} Apr 24 14:24:50.560610 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:50.560524 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jlk8v" Apr 24 14:24:50.574548 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:50.574506 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jlk8v" podStartSLOduration=33.610092205 podStartE2EDuration="36.574490151s" podCreationTimestamp="2026-04-24 14:24:14 +0000 UTC" firstStartedPulling="2026-04-24 14:24:47.324633007 +0000 UTC m=+33.655910749" lastFinishedPulling="2026-04-24 14:24:50.289030953 +0000 UTC m=+36.620308695" observedRunningTime="2026-04-24 14:24:50.573884799 +0000 UTC m=+36.905162562" watchObservedRunningTime="2026-04-24 14:24:50.574490151 +0000 UTC m=+36.905767914" Apr 24 14:24:53.903936 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:53.903895 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert\") pod \"ingress-canary-5gpks\" (UID: \"aa089429-959e-4b12-bddb-1d6d0ce963c9\") " pod="openshift-ingress-canary/ingress-canary-5gpks" Apr 24 14:24:53.904479 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:24:53.903951 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls\") pod \"dns-default-z29nv\" (UID: \"92dbdc96-9b06-45f2-9e4e-317abc345922\") " pod="openshift-dns/dns-default-z29nv" Apr 24 14:24:53.904479 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:53.904035 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:53.904479 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:53.904046 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:53.904479 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:53.904090 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls podName:92dbdc96-9b06-45f2-9e4e-317abc345922 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:01.90407572 +0000 UTC m=+48.235353462 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls") pod "dns-default-z29nv" (UID: "92dbdc96-9b06-45f2-9e4e-317abc345922") : secret "dns-default-metrics-tls" not found Apr 24 14:24:53.904479 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:24:53.904120 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert podName:aa089429-959e-4b12-bddb-1d6d0ce963c9 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:01.90411417 +0000 UTC m=+48.235391913 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert") pod "ingress-canary-5gpks" (UID: "aa089429-959e-4b12-bddb-1d6d0ce963c9") : secret "canary-serving-cert" not found Apr 24 14:25:01.953435 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:25:01.953392 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls\") pod \"dns-default-z29nv\" (UID: \"92dbdc96-9b06-45f2-9e4e-317abc345922\") " pod="openshift-dns/dns-default-z29nv" Apr 24 14:25:01.953435 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:25:01.953446 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert\") pod \"ingress-canary-5gpks\" (UID: \"aa089429-959e-4b12-bddb-1d6d0ce963c9\") " pod="openshift-ingress-canary/ingress-canary-5gpks" Apr 24 14:25:01.953940 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:25:01.953537 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:25:01.953940 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:25:01.953552 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:25:01.953940 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:25:01.953601 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert podName:aa089429-959e-4b12-bddb-1d6d0ce963c9 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:17.953585328 +0000 UTC m=+64.284863071 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert") pod "ingress-canary-5gpks" (UID: "aa089429-959e-4b12-bddb-1d6d0ce963c9") : secret "canary-serving-cert" not found Apr 24 14:25:01.953940 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:25:01.953613 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls podName:92dbdc96-9b06-45f2-9e4e-317abc345922 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:17.953606933 +0000 UTC m=+64.284884675 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls") pod "dns-default-z29nv" (UID: "92dbdc96-9b06-45f2-9e4e-317abc345922") : secret "dns-default-metrics-tls" not found Apr 24 14:25:11.542276 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:25:11.542242 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7ksfw" Apr 24 14:25:17.960479 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:25:17.960437 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls\") pod \"dns-default-z29nv\" (UID: \"92dbdc96-9b06-45f2-9e4e-317abc345922\") " pod="openshift-dns/dns-default-z29nv" Apr 24 14:25:17.960983 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:25:17.960490 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert\") pod \"ingress-canary-5gpks\" (UID: \"aa089429-959e-4b12-bddb-1d6d0ce963c9\") " pod="openshift-ingress-canary/ingress-canary-5gpks" Apr 24 14:25:17.960983 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:25:17.960576 2571 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:25:17.960983 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:25:17.960592 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:25:17.960983 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:25:17.960640 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert podName:aa089429-959e-4b12-bddb-1d6d0ce963c9 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:49.960626727 +0000 UTC m=+96.291904469 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert") pod "ingress-canary-5gpks" (UID: "aa089429-959e-4b12-bddb-1d6d0ce963c9") : secret "canary-serving-cert" not found Apr 24 14:25:17.960983 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:25:17.960677 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls podName:92dbdc96-9b06-45f2-9e4e-317abc345922 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:49.960658607 +0000 UTC m=+96.291936361 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls") pod "dns-default-z29nv" (UID: "92dbdc96-9b06-45f2-9e4e-317abc345922") : secret "dns-default-metrics-tls" not found Apr 24 14:25:18.966613 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:25:18.966574 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs\") pod \"network-metrics-daemon-dkhdd\" (UID: \"7ed1658e-98f8-4fe9-bb01-60b235015d4b\") " pod="openshift-multus/network-metrics-daemon-dkhdd" Apr 24 14:25:18.967061 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:25:18.966730 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 14:25:18.967061 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:25:18.966861 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs podName:7ed1658e-98f8-4fe9-bb01-60b235015d4b nodeName:}" failed. No retries permitted until 2026-04-24 14:26:22.966839867 +0000 UTC m=+129.298117612 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs") pod "network-metrics-daemon-dkhdd" (UID: "7ed1658e-98f8-4fe9-bb01-60b235015d4b") : secret "metrics-daemon-secret" not found Apr 24 14:25:22.566417 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:25:22.566382 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jlk8v" Apr 24 14:25:49.969072 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:25:49.969021 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert\") pod \"ingress-canary-5gpks\" (UID: \"aa089429-959e-4b12-bddb-1d6d0ce963c9\") " pod="openshift-ingress-canary/ingress-canary-5gpks" Apr 24 14:25:49.969479 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:25:49.969087 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls\") pod \"dns-default-z29nv\" (UID: \"92dbdc96-9b06-45f2-9e4e-317abc345922\") " pod="openshift-dns/dns-default-z29nv" Apr 24 14:25:49.969479 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:25:49.969185 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:25:49.969479 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:25:49.969188 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:25:49.969479 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:25:49.969246 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls podName:92dbdc96-9b06-45f2-9e4e-317abc345922 nodeName:}" failed. 
No retries permitted until 2026-04-24 14:26:53.96923234 +0000 UTC m=+160.300510088 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls") pod "dns-default-z29nv" (UID: "92dbdc96-9b06-45f2-9e4e-317abc345922") : secret "dns-default-metrics-tls" not found Apr 24 14:25:49.969479 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:25:49.969259 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert podName:aa089429-959e-4b12-bddb-1d6d0ce963c9 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:53.969253748 +0000 UTC m=+160.300531490 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert") pod "ingress-canary-5gpks" (UID: "aa089429-959e-4b12-bddb-1d6d0ce963c9") : secret "canary-serving-cert" not found Apr 24 14:26:17.936267 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:17.936232 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2nbd4"] Apr 24 14:26:17.939050 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:17.939034 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2nbd4" Apr 24 14:26:17.940917 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:17.940890 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 24 14:26:17.941036 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:17.940932 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:26:17.941493 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:17.941471 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-lbhj8\"" Apr 24 14:26:17.941493 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:17.941487 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 24 14:26:17.949618 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:17.949596 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2nbd4"] Apr 24 14:26:18.048186 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.048151 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-8blnj"] Apr 24 14:26:18.050893 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.050876 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5d8799dfd8-qtxtc"] Apr 24 14:26:18.051022 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.051002 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-8blnj" Apr 24 14:26:18.051113 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.051081 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/54fc4557-398b-44ea-9a01-8689be614e8f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2nbd4\" (UID: \"54fc4557-398b-44ea-9a01-8689be614e8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2nbd4" Apr 24 14:26:18.051180 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.051125 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrbvj\" (UniqueName: \"kubernetes.io/projected/54fc4557-398b-44ea-9a01-8689be614e8f-kube-api-access-jrbvj\") pod \"cluster-samples-operator-6dc5bdb6b4-2nbd4\" (UID: \"54fc4557-398b-44ea-9a01-8689be614e8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2nbd4" Apr 24 14:26:18.053461 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.053446 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5d8799dfd8-qtxtc" Apr 24 14:26:18.056128 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.056091 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 24 14:26:18.056343 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.056329 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 24 14:26:18.056758 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.056735 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 24 14:26:18.056853 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.056758 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 24 14:26:18.056853 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.056781 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 14:26:18.056853 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.056745 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 24 14:26:18.056853 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.056843 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 24 14:26:18.057074 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.057059 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-cjwbh\"" Apr 24 14:26:18.058934 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.058917 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-kvw6c\"" Apr 24 14:26:18.059036 
ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.059005 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 24 14:26:18.059125 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.058960 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 24 14:26:18.059197 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.058978 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 14:26:18.063714 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.063695 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 24 14:26:18.066112 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.066076 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-8blnj"] Apr 24 14:26:18.069054 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.069035 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5d8799dfd8-qtxtc"] Apr 24 14:26:18.146859 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.146827 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-77664b688c-txf2p"] Apr 24 14:26:18.149665 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.149650 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-77664b688c-txf2p" Apr 24 14:26:18.151845 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.151815 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dede47ae-02f2-408e-947c-484180d89394-tmp\") pod \"insights-operator-585dfdc468-8blnj\" (UID: \"dede47ae-02f2-408e-947c-484180d89394\") " pod="openshift-insights/insights-operator-585dfdc468-8blnj" Apr 24 14:26:18.151845 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.151842 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/dede47ae-02f2-408e-947c-484180d89394-snapshots\") pod \"insights-operator-585dfdc468-8blnj\" (UID: \"dede47ae-02f2-408e-947c-484180d89394\") " pod="openshift-insights/insights-operator-585dfdc468-8blnj" Apr 24 14:26:18.152011 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.151902 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dede47ae-02f2-408e-947c-484180d89394-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-8blnj\" (UID: \"dede47ae-02f2-408e-947c-484180d89394\") " pod="openshift-insights/insights-operator-585dfdc468-8blnj" Apr 24 14:26:18.152011 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.151934 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dede47ae-02f2-408e-947c-484180d89394-service-ca-bundle\") pod \"insights-operator-585dfdc468-8blnj\" (UID: \"dede47ae-02f2-408e-947c-484180d89394\") " pod="openshift-insights/insights-operator-585dfdc468-8blnj" Apr 24 14:26:18.152011 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.151960 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bllhx\" (UniqueName: \"kubernetes.io/projected/dede47ae-02f2-408e-947c-484180d89394-kube-api-access-bllhx\") pod \"insights-operator-585dfdc468-8blnj\" (UID: \"dede47ae-02f2-408e-947c-484180d89394\") " pod="openshift-insights/insights-operator-585dfdc468-8blnj" Apr 24 14:26:18.152011 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.151987 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 14:26:18.152207 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.152011 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2zs4\" (UniqueName: \"kubernetes.io/projected/de813c8e-7fcb-4f67-b2eb-58050b724a12-kube-api-access-k2zs4\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc" Apr 24 14:26:18.152207 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.152013 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 14:26:18.152207 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.152039 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-metrics-certs\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc" Apr 24 14:26:18.152207 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.152088 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-2wpbm\"" Apr 24 14:26:18.152207 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.152169 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-default-certificate\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc" Apr 24 14:26:18.152384 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.152217 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de813c8e-7fcb-4f67-b2eb-58050b724a12-service-ca-bundle\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc" Apr 24 14:26:18.152384 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.152252 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-stats-auth\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc" Apr 24 14:26:18.152384 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.152298 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dede47ae-02f2-408e-947c-484180d89394-serving-cert\") pod \"insights-operator-585dfdc468-8blnj\" (UID: \"dede47ae-02f2-408e-947c-484180d89394\") " pod="openshift-insights/insights-operator-585dfdc468-8blnj" Apr 24 14:26:18.152384 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.152331 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/54fc4557-398b-44ea-9a01-8689be614e8f-samples-operator-tls\") pod 
\"cluster-samples-operator-6dc5bdb6b4-2nbd4\" (UID: \"54fc4557-398b-44ea-9a01-8689be614e8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2nbd4" Apr 24 14:26:18.152384 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.152355 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrbvj\" (UniqueName: \"kubernetes.io/projected/54fc4557-398b-44ea-9a01-8689be614e8f-kube-api-access-jrbvj\") pod \"cluster-samples-operator-6dc5bdb6b4-2nbd4\" (UID: \"54fc4557-398b-44ea-9a01-8689be614e8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2nbd4" Apr 24 14:26:18.152612 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:18.152505 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 14:26:18.152661 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.152616 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 14:26:18.152661 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:18.152622 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54fc4557-398b-44ea-9a01-8689be614e8f-samples-operator-tls podName:54fc4557-398b-44ea-9a01-8689be614e8f nodeName:}" failed. No retries permitted until 2026-04-24 14:26:18.65260111 +0000 UTC m=+124.983878857 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/54fc4557-398b-44ea-9a01-8689be614e8f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2nbd4" (UID: "54fc4557-398b-44ea-9a01-8689be614e8f") : secret "samples-operator-tls" not found Apr 24 14:26:18.159025 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.159006 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 14:26:18.163372 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.163345 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-77664b688c-txf2p"] Apr 24 14:26:18.175017 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.175000 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrbvj\" (UniqueName: \"kubernetes.io/projected/54fc4557-398b-44ea-9a01-8689be614e8f-kube-api-access-jrbvj\") pod \"cluster-samples-operator-6dc5bdb6b4-2nbd4\" (UID: \"54fc4557-398b-44ea-9a01-8689be614e8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2nbd4" Apr 24 14:26:18.253557 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.253535 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de813c8e-7fcb-4f67-b2eb-58050b724a12-service-ca-bundle\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc" Apr 24 14:26:18.253678 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.253572 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-bound-sa-token\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " 
pod="openshift-image-registry/image-registry-77664b688c-txf2p" Apr 24 14:26:18.253678 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.253595 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-stats-auth\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc" Apr 24 14:26:18.253678 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.253613 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-image-registry-private-configuration\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p" Apr 24 14:26:18.253843 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.253730 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dede47ae-02f2-408e-947c-484180d89394-serving-cert\") pod \"insights-operator-585dfdc468-8blnj\" (UID: \"dede47ae-02f2-408e-947c-484180d89394\") " pod="openshift-insights/insights-operator-585dfdc468-8blnj" Apr 24 14:26:18.253843 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:18.253764 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de813c8e-7fcb-4f67-b2eb-58050b724a12-service-ca-bundle podName:de813c8e-7fcb-4f67-b2eb-58050b724a12 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:18.753746913 +0000 UTC m=+125.085024673 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/de813c8e-7fcb-4f67-b2eb-58050b724a12-service-ca-bundle") pod "router-default-5d8799dfd8-qtxtc" (UID: "de813c8e-7fcb-4f67-b2eb-58050b724a12") : configmap references non-existent config key: service-ca.crt Apr 24 14:26:18.253843 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.253821 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-ca-trust-extracted\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p" Apr 24 14:26:18.253990 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.253875 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dede47ae-02f2-408e-947c-484180d89394-tmp\") pod \"insights-operator-585dfdc468-8blnj\" (UID: \"dede47ae-02f2-408e-947c-484180d89394\") " pod="openshift-insights/insights-operator-585dfdc468-8blnj" Apr 24 14:26:18.253990 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.253908 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/dede47ae-02f2-408e-947c-484180d89394-snapshots\") pod \"insights-operator-585dfdc468-8blnj\" (UID: \"dede47ae-02f2-408e-947c-484180d89394\") " pod="openshift-insights/insights-operator-585dfdc468-8blnj" Apr 24 14:26:18.253990 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.253945 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dede47ae-02f2-408e-947c-484180d89394-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-8blnj\" (UID: \"dede47ae-02f2-408e-947c-484180d89394\") " 
pod="openshift-insights/insights-operator-585dfdc468-8blnj" Apr 24 14:26:18.253990 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.253965 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dede47ae-02f2-408e-947c-484180d89394-service-ca-bundle\") pod \"insights-operator-585dfdc468-8blnj\" (UID: \"dede47ae-02f2-408e-947c-484180d89394\") " pod="openshift-insights/insights-operator-585dfdc468-8blnj" Apr 24 14:26:18.253990 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.253980 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bllhx\" (UniqueName: \"kubernetes.io/projected/dede47ae-02f2-408e-947c-484180d89394-kube-api-access-bllhx\") pod \"insights-operator-585dfdc468-8blnj\" (UID: \"dede47ae-02f2-408e-947c-484180d89394\") " pod="openshift-insights/insights-operator-585dfdc468-8blnj" Apr 24 14:26:18.254257 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.254000 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2zs4\" (UniqueName: \"kubernetes.io/projected/de813c8e-7fcb-4f67-b2eb-58050b724a12-kube-api-access-k2zs4\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc" Apr 24 14:26:18.254257 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.254047 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-certificates\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p" Apr 24 14:26:18.254360 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.254269 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/dede47ae-02f2-408e-947c-484180d89394-tmp\") pod \"insights-operator-585dfdc468-8blnj\" (UID: \"dede47ae-02f2-408e-947c-484180d89394\") " pod="openshift-insights/insights-operator-585dfdc468-8blnj" Apr 24 14:26:18.254409 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.254384 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-metrics-certs\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc" Apr 24 14:26:18.254465 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.254417 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-tls\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p" Apr 24 14:26:18.254465 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.254444 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gwhj\" (UniqueName: \"kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-kube-api-access-9gwhj\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p" Apr 24 14:26:18.254563 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.254473 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-trusted-ca\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p" 
Apr 24 14:26:18.254563 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.254505 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-default-certificate\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc" Apr 24 14:26:18.254563 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.254532 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-installation-pull-secrets\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p" Apr 24 14:26:18.254718 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.254565 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dede47ae-02f2-408e-947c-484180d89394-service-ca-bundle\") pod \"insights-operator-585dfdc468-8blnj\" (UID: \"dede47ae-02f2-408e-947c-484180d89394\") " pod="openshift-insights/insights-operator-585dfdc468-8blnj" Apr 24 14:26:18.254718 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.254572 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/dede47ae-02f2-408e-947c-484180d89394-snapshots\") pod \"insights-operator-585dfdc468-8blnj\" (UID: \"dede47ae-02f2-408e-947c-484180d89394\") " pod="openshift-insights/insights-operator-585dfdc468-8blnj" Apr 24 14:26:18.254718 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:18.254603 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 14:26:18.254718 ip-10-0-138-116 
kubenswrapper[2571]: E0424 14:26:18.254657 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-metrics-certs podName:de813c8e-7fcb-4f67-b2eb-58050b724a12 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:18.75463799 +0000 UTC m=+125.085915746 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-metrics-certs") pod "router-default-5d8799dfd8-qtxtc" (UID: "de813c8e-7fcb-4f67-b2eb-58050b724a12") : secret "router-metrics-certs-default" not found Apr 24 14:26:18.254855 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.254786 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dede47ae-02f2-408e-947c-484180d89394-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-8blnj\" (UID: \"dede47ae-02f2-408e-947c-484180d89394\") " pod="openshift-insights/insights-operator-585dfdc468-8blnj" Apr 24 14:26:18.256016 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.255994 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-stats-auth\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc" Apr 24 14:26:18.256112 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.255999 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dede47ae-02f2-408e-947c-484180d89394-serving-cert\") pod \"insights-operator-585dfdc468-8blnj\" (UID: \"dede47ae-02f2-408e-947c-484180d89394\") " pod="openshift-insights/insights-operator-585dfdc468-8blnj" Apr 24 14:26:18.257687 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.257670 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-default-certificate\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc" Apr 24 14:26:18.263902 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.263878 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2zs4\" (UniqueName: \"kubernetes.io/projected/de813c8e-7fcb-4f67-b2eb-58050b724a12-kube-api-access-k2zs4\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc" Apr 24 14:26:18.264145 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.264125 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bllhx\" (UniqueName: \"kubernetes.io/projected/dede47ae-02f2-408e-947c-484180d89394-kube-api-access-bllhx\") pod \"insights-operator-585dfdc468-8blnj\" (UID: \"dede47ae-02f2-408e-947c-484180d89394\") " pod="openshift-insights/insights-operator-585dfdc468-8blnj" Apr 24 14:26:18.354775 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.354750 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-image-registry-private-configuration\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p" Apr 24 14:26:18.354902 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.354801 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-ca-trust-extracted\") pod \"image-registry-77664b688c-txf2p\" (UID: 
\"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p" Apr 24 14:26:18.354902 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.354841 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-certificates\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p" Apr 24 14:26:18.354902 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.354864 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-tls\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p" Apr 24 14:26:18.354902 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.354880 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9gwhj\" (UniqueName: \"kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-kube-api-access-9gwhj\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p" Apr 24 14:26:18.354902 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.354895 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-trusted-ca\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p" Apr 24 14:26:18.355172 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.354917 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-installation-pull-secrets\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p" Apr 24 14:26:18.355172 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.354963 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-bound-sa-token\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p" Apr 24 14:26:18.355269 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:18.355175 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 14:26:18.355269 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:18.355195 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77664b688c-txf2p: secret "image-registry-tls" not found Apr 24 14:26:18.355269 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:18.355260 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-tls podName:a41a9f0b-b6ca-49c0-9009-04ff82be6d5d nodeName:}" failed. No retries permitted until 2026-04-24 14:26:18.855240809 +0000 UTC m=+125.186518568 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-tls") pod "image-registry-77664b688c-txf2p" (UID: "a41a9f0b-b6ca-49c0-9009-04ff82be6d5d") : secret "image-registry-tls" not found
Apr 24 14:26:18.355428 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.355318 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-ca-trust-extracted\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p"
Apr 24 14:26:18.355516 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.355492 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-certificates\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p"
Apr 24 14:26:18.355949 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.355926 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-trusted-ca\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p"
Apr 24 14:26:18.356988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.356968 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-image-registry-private-configuration\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p"
Apr 24 14:26:18.357203 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.357186 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-installation-pull-secrets\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p"
Apr 24 14:26:18.361113 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.361078 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-8blnj"
Apr 24 14:26:18.366287 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.365545 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-bound-sa-token\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p"
Apr 24 14:26:18.366712 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.366693 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gwhj\" (UniqueName: \"kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-kube-api-access-9gwhj\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p"
Apr 24 14:26:18.473482 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.473450 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-8blnj"]
Apr 24 14:26:18.476436 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:26:18.476409 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddede47ae_02f2_408e_947c_484180d89394.slice/crio-396a7846d3020d4559ff9f5569123828b7cc7f39e42ebc2716420e4fdf9664a9 WatchSource:0}: Error finding container 396a7846d3020d4559ff9f5569123828b7cc7f39e42ebc2716420e4fdf9664a9: Status 404 returned error can't find the container with id 396a7846d3020d4559ff9f5569123828b7cc7f39e42ebc2716420e4fdf9664a9
Apr 24 14:26:18.656763 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.656678 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/54fc4557-398b-44ea-9a01-8689be614e8f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2nbd4\" (UID: \"54fc4557-398b-44ea-9a01-8689be614e8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2nbd4"
Apr 24 14:26:18.656893 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:18.656829 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 14:26:18.656927 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:18.656897 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54fc4557-398b-44ea-9a01-8689be614e8f-samples-operator-tls podName:54fc4557-398b-44ea-9a01-8689be614e8f nodeName:}" failed. No retries permitted until 2026-04-24 14:26:19.656879326 +0000 UTC m=+125.988157086 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/54fc4557-398b-44ea-9a01-8689be614e8f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2nbd4" (UID: "54fc4557-398b-44ea-9a01-8689be614e8f") : secret "samples-operator-tls" not found
Apr 24 14:26:18.727034 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.727002 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8blnj" event={"ID":"dede47ae-02f2-408e-947c-484180d89394","Type":"ContainerStarted","Data":"396a7846d3020d4559ff9f5569123828b7cc7f39e42ebc2716420e4fdf9664a9"}
Apr 24 14:26:18.757340 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.757314 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de813c8e-7fcb-4f67-b2eb-58050b724a12-service-ca-bundle\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc"
Apr 24 14:26:18.757465 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.757426 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-metrics-certs\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc"
Apr 24 14:26:18.757527 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:18.757470 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de813c8e-7fcb-4f67-b2eb-58050b724a12-service-ca-bundle podName:de813c8e-7fcb-4f67-b2eb-58050b724a12 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:19.757453941 +0000 UTC m=+126.088731707 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/de813c8e-7fcb-4f67-b2eb-58050b724a12-service-ca-bundle") pod "router-default-5d8799dfd8-qtxtc" (UID: "de813c8e-7fcb-4f67-b2eb-58050b724a12") : configmap references non-existent config key: service-ca.crt
Apr 24 14:26:18.757588 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:18.757531 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 14:26:18.757628 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:18.757589 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-metrics-certs podName:de813c8e-7fcb-4f67-b2eb-58050b724a12 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:19.757571571 +0000 UTC m=+126.088849327 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-metrics-certs") pod "router-default-5d8799dfd8-qtxtc" (UID: "de813c8e-7fcb-4f67-b2eb-58050b724a12") : secret "router-metrics-certs-default" not found
Apr 24 14:26:18.858685 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:18.858653 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-tls\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p"
Apr 24 14:26:18.858815 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:18.858796 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 14:26:18.858872 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:18.858819 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77664b688c-txf2p: secret "image-registry-tls" not found
Apr 24 14:26:18.858902 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:18.858873 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-tls podName:a41a9f0b-b6ca-49c0-9009-04ff82be6d5d nodeName:}" failed. No retries permitted until 2026-04-24 14:26:19.858858323 +0000 UTC m=+126.190136070 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-tls") pod "image-registry-77664b688c-txf2p" (UID: "a41a9f0b-b6ca-49c0-9009-04ff82be6d5d") : secret "image-registry-tls" not found
Apr 24 14:26:19.665720 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:19.665683 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/54fc4557-398b-44ea-9a01-8689be614e8f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2nbd4\" (UID: \"54fc4557-398b-44ea-9a01-8689be614e8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2nbd4"
Apr 24 14:26:19.666147 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:19.665829 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 14:26:19.666147 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:19.665892 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54fc4557-398b-44ea-9a01-8689be614e8f-samples-operator-tls podName:54fc4557-398b-44ea-9a01-8689be614e8f nodeName:}" failed. No retries permitted until 2026-04-24 14:26:21.665875432 +0000 UTC m=+127.997153194 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/54fc4557-398b-44ea-9a01-8689be614e8f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2nbd4" (UID: "54fc4557-398b-44ea-9a01-8689be614e8f") : secret "samples-operator-tls" not found
Apr 24 14:26:19.767066 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:19.767029 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-metrics-certs\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc"
Apr 24 14:26:19.767246 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:19.767094 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de813c8e-7fcb-4f67-b2eb-58050b724a12-service-ca-bundle\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc"
Apr 24 14:26:19.767246 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:19.767189 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 14:26:19.767339 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:19.767252 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de813c8e-7fcb-4f67-b2eb-58050b724a12-service-ca-bundle podName:de813c8e-7fcb-4f67-b2eb-58050b724a12 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:21.767236682 +0000 UTC m=+128.098514444 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/de813c8e-7fcb-4f67-b2eb-58050b724a12-service-ca-bundle") pod "router-default-5d8799dfd8-qtxtc" (UID: "de813c8e-7fcb-4f67-b2eb-58050b724a12") : configmap references non-existent config key: service-ca.crt
Apr 24 14:26:19.767339 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:19.767267 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-metrics-certs podName:de813c8e-7fcb-4f67-b2eb-58050b724a12 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:21.767261445 +0000 UTC m=+128.098539187 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-metrics-certs") pod "router-default-5d8799dfd8-qtxtc" (UID: "de813c8e-7fcb-4f67-b2eb-58050b724a12") : secret "router-metrics-certs-default" not found
Apr 24 14:26:19.868350 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:19.868311 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-tls\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p"
Apr 24 14:26:19.868558 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:19.868443 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 14:26:19.868558 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:19.868462 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77664b688c-txf2p: secret "image-registry-tls" not found
Apr 24 14:26:19.868558 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:19.868526 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-tls podName:a41a9f0b-b6ca-49c0-9009-04ff82be6d5d nodeName:}" failed. No retries permitted until 2026-04-24 14:26:21.868507141 +0000 UTC m=+128.199784888 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-tls") pod "image-registry-77664b688c-txf2p" (UID: "a41a9f0b-b6ca-49c0-9009-04ff82be6d5d") : secret "image-registry-tls" not found
Apr 24 14:26:20.731918 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:20.731883 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8blnj" event={"ID":"dede47ae-02f2-408e-947c-484180d89394","Type":"ContainerStarted","Data":"e6c890c9982ca1393284c400ed73d9090ddca04426f321458c94aef3ec13e1e7"}
Apr 24 14:26:20.747657 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:20.747611 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-8blnj" podStartSLOduration=1.130820312 podStartE2EDuration="2.747597139s" podCreationTimestamp="2026-04-24 14:26:18 +0000 UTC" firstStartedPulling="2026-04-24 14:26:18.478089995 +0000 UTC m=+124.809367737" lastFinishedPulling="2026-04-24 14:26:20.094866821 +0000 UTC m=+126.426144564" observedRunningTime="2026-04-24 14:26:20.746966149 +0000 UTC m=+127.078243912" watchObservedRunningTime="2026-04-24 14:26:20.747597139 +0000 UTC m=+127.078874902"
Apr 24 14:26:21.682027 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:21.681984 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/54fc4557-398b-44ea-9a01-8689be614e8f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2nbd4\" (UID: \"54fc4557-398b-44ea-9a01-8689be614e8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2nbd4"
Apr 24 14:26:21.682228 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:21.682134 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 14:26:21.682228 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:21.682192 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54fc4557-398b-44ea-9a01-8689be614e8f-samples-operator-tls podName:54fc4557-398b-44ea-9a01-8689be614e8f nodeName:}" failed. No retries permitted until 2026-04-24 14:26:25.682178077 +0000 UTC m=+132.013455833 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/54fc4557-398b-44ea-9a01-8689be614e8f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2nbd4" (UID: "54fc4557-398b-44ea-9a01-8689be614e8f") : secret "samples-operator-tls" not found
Apr 24 14:26:21.782947 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:21.782910 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-metrics-certs\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc"
Apr 24 14:26:21.783386 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:21.782979 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de813c8e-7fcb-4f67-b2eb-58050b724a12-service-ca-bundle\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc"
Apr 24 14:26:21.783386 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:21.783127 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de813c8e-7fcb-4f67-b2eb-58050b724a12-service-ca-bundle podName:de813c8e-7fcb-4f67-b2eb-58050b724a12 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:25.783084194 +0000 UTC m=+132.114361952 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/de813c8e-7fcb-4f67-b2eb-58050b724a12-service-ca-bundle") pod "router-default-5d8799dfd8-qtxtc" (UID: "de813c8e-7fcb-4f67-b2eb-58050b724a12") : configmap references non-existent config key: service-ca.crt
Apr 24 14:26:21.783386 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:21.783190 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 14:26:21.783386 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:21.783257 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-metrics-certs podName:de813c8e-7fcb-4f67-b2eb-58050b724a12 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:25.78324207 +0000 UTC m=+132.114519844 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-metrics-certs") pod "router-default-5d8799dfd8-qtxtc" (UID: "de813c8e-7fcb-4f67-b2eb-58050b724a12") : secret "router-metrics-certs-default" not found
Apr 24 14:26:21.883946 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:21.883910 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-tls\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p"
Apr 24 14:26:21.884093 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:21.884053 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 14:26:21.884093 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:21.884075 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77664b688c-txf2p: secret "image-registry-tls" not found
Apr 24 14:26:21.884186 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:21.884150 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-tls podName:a41a9f0b-b6ca-49c0-9009-04ff82be6d5d nodeName:}" failed. No retries permitted until 2026-04-24 14:26:25.884135244 +0000 UTC m=+132.215413006 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-tls") pod "image-registry-77664b688c-txf2p" (UID: "a41a9f0b-b6ca-49c0-9009-04ff82be6d5d") : secret "image-registry-tls" not found
Apr 24 14:26:22.991246 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:22.991205 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs\") pod \"network-metrics-daemon-dkhdd\" (UID: \"7ed1658e-98f8-4fe9-bb01-60b235015d4b\") " pod="openshift-multus/network-metrics-daemon-dkhdd"
Apr 24 14:26:22.991696 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:22.991370 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 14:26:22.991696 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:22.991457 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs podName:7ed1658e-98f8-4fe9-bb01-60b235015d4b nodeName:}" failed. No retries permitted until 2026-04-24 14:28:24.991436257 +0000 UTC m=+251.322714003 (durationBeforeRetry 2m2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs") pod "network-metrics-daemon-dkhdd" (UID: "7ed1658e-98f8-4fe9-bb01-60b235015d4b") : secret "metrics-daemon-secret" not found
Apr 24 14:26:23.722820 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:23.722794 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2rs7b_2ded180c-8601-4aaf-86bd-6a13b101faa8/dns-node-resolver/0.log"
Apr 24 14:26:25.126930 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:25.126905 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tlvj4_d2eb35e3-b76b-433c-b90d-a4481a2cd709/node-ca/0.log"
Apr 24 14:26:25.713159 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:25.713126 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/54fc4557-398b-44ea-9a01-8689be614e8f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2nbd4\" (UID: \"54fc4557-398b-44ea-9a01-8689be614e8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2nbd4"
Apr 24 14:26:25.713327 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:25.713269 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 14:26:25.713366 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:25.713332 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54fc4557-398b-44ea-9a01-8689be614e8f-samples-operator-tls podName:54fc4557-398b-44ea-9a01-8689be614e8f nodeName:}" failed. No retries permitted until 2026-04-24 14:26:33.713317009 +0000 UTC m=+140.044594755 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/54fc4557-398b-44ea-9a01-8689be614e8f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2nbd4" (UID: "54fc4557-398b-44ea-9a01-8689be614e8f") : secret "samples-operator-tls" not found
Apr 24 14:26:25.814544 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:25.814501 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-metrics-certs\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc"
Apr 24 14:26:25.814700 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:25.814561 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de813c8e-7fcb-4f67-b2eb-58050b724a12-service-ca-bundle\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc"
Apr 24 14:26:25.814700 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:25.814631 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 14:26:25.814700 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:25.814667 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de813c8e-7fcb-4f67-b2eb-58050b724a12-service-ca-bundle podName:de813c8e-7fcb-4f67-b2eb-58050b724a12 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:33.814650886 +0000 UTC m=+140.145928642 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/de813c8e-7fcb-4f67-b2eb-58050b724a12-service-ca-bundle") pod "router-default-5d8799dfd8-qtxtc" (UID: "de813c8e-7fcb-4f67-b2eb-58050b724a12") : configmap references non-existent config key: service-ca.crt
Apr 24 14:26:25.814700 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:25.814689 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-metrics-certs podName:de813c8e-7fcb-4f67-b2eb-58050b724a12 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:33.814682262 +0000 UTC m=+140.145960005 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-metrics-certs") pod "router-default-5d8799dfd8-qtxtc" (UID: "de813c8e-7fcb-4f67-b2eb-58050b724a12") : secret "router-metrics-certs-default" not found
Apr 24 14:26:25.915377 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:25.915347 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-tls\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p"
Apr 24 14:26:25.915500 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:25.915448 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 14:26:25.915500 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:25.915458 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77664b688c-txf2p: secret "image-registry-tls" not found
Apr 24 14:26:25.915500 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:25.915499 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-tls podName:a41a9f0b-b6ca-49c0-9009-04ff82be6d5d nodeName:}" failed. No retries permitted until 2026-04-24 14:26:33.915487235 +0000 UTC m=+140.246764977 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-tls") pod "image-registry-77664b688c-txf2p" (UID: "a41a9f0b-b6ca-49c0-9009-04ff82be6d5d") : secret "image-registry-tls" not found
Apr 24 14:26:33.775352 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:33.775319 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/54fc4557-398b-44ea-9a01-8689be614e8f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2nbd4\" (UID: \"54fc4557-398b-44ea-9a01-8689be614e8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2nbd4"
Apr 24 14:26:33.777648 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:33.777616 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/54fc4557-398b-44ea-9a01-8689be614e8f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2nbd4\" (UID: \"54fc4557-398b-44ea-9a01-8689be614e8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2nbd4"
Apr 24 14:26:33.848907 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:33.848880 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2nbd4"
Apr 24 14:26:33.876715 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:33.876684 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de813c8e-7fcb-4f67-b2eb-58050b724a12-service-ca-bundle\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc"
Apr 24 14:26:33.876827 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:33.876769 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-metrics-certs\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc"
Apr 24 14:26:33.877247 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:33.877230 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de813c8e-7fcb-4f67-b2eb-58050b724a12-service-ca-bundle\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc"
Apr 24 14:26:33.879360 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:33.879340 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de813c8e-7fcb-4f67-b2eb-58050b724a12-metrics-certs\") pod \"router-default-5d8799dfd8-qtxtc\" (UID: \"de813c8e-7fcb-4f67-b2eb-58050b724a12\") " pod="openshift-ingress/router-default-5d8799dfd8-qtxtc"
Apr 24 14:26:33.957988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:33.957956 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2nbd4"]
Apr 24 14:26:33.965573 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:33.965547 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5d8799dfd8-qtxtc"
Apr 24 14:26:33.977048 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:33.977024 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-tls\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p"
Apr 24 14:26:33.979606 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:33.979583 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-tls\") pod \"image-registry-77664b688c-txf2p\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") " pod="openshift-image-registry/image-registry-77664b688c-txf2p"
Apr 24 14:26:34.058458 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:34.058425 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-77664b688c-txf2p"
Apr 24 14:26:34.087420 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:34.087390 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5d8799dfd8-qtxtc"]
Apr 24 14:26:34.090335 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:26:34.090302 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde813c8e_7fcb_4f67_b2eb_58050b724a12.slice/crio-a191955bc1cafd03af5911b9101566b0b0a058a3d9c755b071ababa644c61f8d WatchSource:0}: Error finding container a191955bc1cafd03af5911b9101566b0b0a058a3d9c755b071ababa644c61f8d: Status 404 returned error can't find the container with id a191955bc1cafd03af5911b9101566b0b0a058a3d9c755b071ababa644c61f8d
Apr 24 14:26:34.185665 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:34.185637 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-77664b688c-txf2p"]
Apr 24 14:26:34.189166 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:26:34.189138 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda41a9f0b_b6ca_49c0_9009_04ff82be6d5d.slice/crio-5cd34177334915cb0b03a5c900ca93ad06ae071229cf475282a2157892f1190e WatchSource:0}: Error finding container 5cd34177334915cb0b03a5c900ca93ad06ae071229cf475282a2157892f1190e: Status 404 returned error can't find the container with id 5cd34177334915cb0b03a5c900ca93ad06ae071229cf475282a2157892f1190e
Apr 24 14:26:34.760379 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:34.760342 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77664b688c-txf2p" event={"ID":"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d","Type":"ContainerStarted","Data":"7e45a5c7e60ffee9037d2ecca5d98324794558bd95bc9714530fa08fa0044594"}
Apr 24 14:26:34.760379 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:34.760384 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77664b688c-txf2p" event={"ID":"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d","Type":"ContainerStarted","Data":"5cd34177334915cb0b03a5c900ca93ad06ae071229cf475282a2157892f1190e"}
Apr 24 14:26:34.760632 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:34.760435 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-77664b688c-txf2p"
Apr 24 14:26:34.761766 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:34.761739 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5d8799dfd8-qtxtc" event={"ID":"de813c8e-7fcb-4f67-b2eb-58050b724a12","Type":"ContainerStarted","Data":"b161596180f20a161365b5fd3f137a5b9a8c6e06401a50f1842923ad77deeff5"}
Apr 24 14:26:34.761865 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:34.761768 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5d8799dfd8-qtxtc" event={"ID":"de813c8e-7fcb-4f67-b2eb-58050b724a12","Type":"ContainerStarted","Data":"a191955bc1cafd03af5911b9101566b0b0a058a3d9c755b071ababa644c61f8d"}
Apr 24 14:26:34.762797 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:34.762774 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2nbd4" event={"ID":"54fc4557-398b-44ea-9a01-8689be614e8f","Type":"ContainerStarted","Data":"3bcd12fdf83781afca0563b4f06dfd5a5e9b990cba4782b8ee1a5deb3b34960d"}
Apr 24 14:26:34.779176 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:34.779132 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-77664b688c-txf2p" podStartSLOduration=16.779115962 podStartE2EDuration="16.779115962s" podCreationTimestamp="2026-04-24 14:26:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:26:34.777953278 +0000 UTC m=+141.109231042" watchObservedRunningTime="2026-04-24 14:26:34.779115962 +0000 UTC m=+141.110393734"
Apr 24 14:26:34.798716 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:34.798664 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5d8799dfd8-qtxtc" podStartSLOduration=16.79864809 podStartE2EDuration="16.79864809s" podCreationTimestamp="2026-04-24 14:26:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:26:34.797962666 +0000 UTC m=+141.129240454" watchObservedRunningTime="2026-04-24 14:26:34.79864809 +0000 UTC m=+141.129925856"
Apr 24 14:26:34.966040 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:34.965985 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5d8799dfd8-qtxtc"
Apr 24 14:26:34.968988 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:34.968967 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5d8799dfd8-qtxtc"
Apr 24 14:26:35.767176 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:35.767138 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2nbd4" event={"ID":"54fc4557-398b-44ea-9a01-8689be614e8f","Type":"ContainerStarted","Data":"6be1c4a21abb1cfc4f2cc39a668bffa9024a5ac05606cdfa094ef15f5f33679d"}
Apr 24 14:26:35.767316 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:35.767185 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2nbd4" event={"ID":"54fc4557-398b-44ea-9a01-8689be614e8f","Type":"ContainerStarted","Data":"767a5670aa81bfebeb2ce84a202d0e0f1f9bd888b704946d93abfbe5ef7536ee"}
Apr 24 14:26:35.767611 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:35.767589 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5d8799dfd8-qtxtc"
Apr 24 14:26:35.768659 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:35.768636 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5d8799dfd8-qtxtc"
Apr 24 14:26:35.781167 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:35.781126 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2nbd4" podStartSLOduration=17.178332656 podStartE2EDuration="18.781089376s" podCreationTimestamp="2026-04-24 14:26:17 +0000 UTC" firstStartedPulling="2026-04-24 14:26:34.004618606 +0000 UTC m=+140.335896348" lastFinishedPulling="2026-04-24 14:26:35.607375321 +0000 UTC m=+141.938653068" observedRunningTime="2026-04-24 14:26:35.780534166 +0000 UTC m=+142.111811931" watchObservedRunningTime="2026-04-24 14:26:35.781089376 +0000 UTC m=+142.112367140"
Apr 24 14:26:47.467952 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.467915 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-77664b688c-txf2p"]
Apr 24 14:26:47.476041 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.476017 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-z8s9x"]
Apr 24 14:26:47.480536 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.480519 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-z8s9x" Apr 24 14:26:47.483640 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.483622 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 14:26:47.483869 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.483853 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-zf9np\"" Apr 24 14:26:47.486281 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.486262 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 14:26:47.517544 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.517512 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-z8s9x"] Apr 24 14:26:47.531724 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.531699 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5d475cf49c-4txv4"] Apr 24 14:26:47.534405 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.534387 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.548958 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.548940 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d475cf49c-4txv4"] Apr 24 14:26:47.553427 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.553404 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-tpm5c"] Apr 24 14:26:47.556177 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.556162 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-tpm5c" Apr 24 14:26:47.558570 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.558540 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 14:26:47.558660 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.558600 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 14:26:47.558660 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.558635 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-n5vt8\"" Apr 24 14:26:47.569215 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.569197 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-tpm5c"] Apr 24 14:26:47.572430 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.572412 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6836012d-7ba1-41cd-a9e7-bac6c71b7852-data-volume\") pod \"insights-runtime-extractor-z8s9x\" (UID: \"6836012d-7ba1-41cd-a9e7-bac6c71b7852\") " pod="openshift-insights/insights-runtime-extractor-z8s9x" Apr 24 14:26:47.572556 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.572535 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44p2n\" (UniqueName: \"kubernetes.io/projected/6836012d-7ba1-41cd-a9e7-bac6c71b7852-kube-api-access-44p2n\") pod \"insights-runtime-extractor-z8s9x\" (UID: \"6836012d-7ba1-41cd-a9e7-bac6c71b7852\") " pod="openshift-insights/insights-runtime-extractor-z8s9x" Apr 24 14:26:47.572615 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.572574 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" 
(UniqueName: \"kubernetes.io/configmap/6836012d-7ba1-41cd-a9e7-bac6c71b7852-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-z8s9x\" (UID: \"6836012d-7ba1-41cd-a9e7-bac6c71b7852\") " pod="openshift-insights/insights-runtime-extractor-z8s9x" Apr 24 14:26:47.572754 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.572664 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6836012d-7ba1-41cd-a9e7-bac6c71b7852-crio-socket\") pod \"insights-runtime-extractor-z8s9x\" (UID: \"6836012d-7ba1-41cd-a9e7-bac6c71b7852\") " pod="openshift-insights/insights-runtime-extractor-z8s9x" Apr 24 14:26:47.572754 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.572718 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6836012d-7ba1-41cd-a9e7-bac6c71b7852-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-z8s9x\" (UID: \"6836012d-7ba1-41cd-a9e7-bac6c71b7852\") " pod="openshift-insights/insights-runtime-extractor-z8s9x" Apr 24 14:26:47.673639 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.673582 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3c497767-0280-48b6-a885-0915f5cc5c12-ca-trust-extracted\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.673639 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.673619 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3c497767-0280-48b6-a885-0915f5cc5c12-installation-pull-secrets\") pod \"image-registry-5d475cf49c-4txv4\" (UID: 
\"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.673796 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.673648 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44p2n\" (UniqueName: \"kubernetes.io/projected/6836012d-7ba1-41cd-a9e7-bac6c71b7852-kube-api-access-44p2n\") pod \"insights-runtime-extractor-z8s9x\" (UID: \"6836012d-7ba1-41cd-a9e7-bac6c71b7852\") " pod="openshift-insights/insights-runtime-extractor-z8s9x" Apr 24 14:26:47.673796 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.673663 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64nhm\" (UniqueName: \"kubernetes.io/projected/3c497767-0280-48b6-a885-0915f5cc5c12-kube-api-access-64nhm\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.673796 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.673719 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3c497767-0280-48b6-a885-0915f5cc5c12-image-registry-private-configuration\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.673796 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.673772 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6836012d-7ba1-41cd-a9e7-bac6c71b7852-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-z8s9x\" (UID: \"6836012d-7ba1-41cd-a9e7-bac6c71b7852\") " pod="openshift-insights/insights-runtime-extractor-z8s9x" Apr 24 14:26:47.673946 ip-10-0-138-116 kubenswrapper[2571]: I0424 
14:26:47.673835 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6836012d-7ba1-41cd-a9e7-bac6c71b7852-crio-socket\") pod \"insights-runtime-extractor-z8s9x\" (UID: \"6836012d-7ba1-41cd-a9e7-bac6c71b7852\") " pod="openshift-insights/insights-runtime-extractor-z8s9x" Apr 24 14:26:47.673946 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.673852 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c497767-0280-48b6-a885-0915f5cc5c12-trusted-ca\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.673946 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.673876 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6836012d-7ba1-41cd-a9e7-bac6c71b7852-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-z8s9x\" (UID: \"6836012d-7ba1-41cd-a9e7-bac6c71b7852\") " pod="openshift-insights/insights-runtime-extractor-z8s9x" Apr 24 14:26:47.673946 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.673888 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6836012d-7ba1-41cd-a9e7-bac6c71b7852-crio-socket\") pod \"insights-runtime-extractor-z8s9x\" (UID: \"6836012d-7ba1-41cd-a9e7-bac6c71b7852\") " pod="openshift-insights/insights-runtime-extractor-z8s9x" Apr 24 14:26:47.673946 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.673906 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsjtl\" (UniqueName: \"kubernetes.io/projected/75f88656-05dd-4670-8fee-421667219118-kube-api-access-qsjtl\") pod 
\"downloads-6bcc868b7-tpm5c\" (UID: \"75f88656-05dd-4670-8fee-421667219118\") " pod="openshift-console/downloads-6bcc868b7-tpm5c" Apr 24 14:26:47.673946 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.673936 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3c497767-0280-48b6-a885-0915f5cc5c12-registry-certificates\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.674195 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.673951 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c497767-0280-48b6-a885-0915f5cc5c12-bound-sa-token\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.674195 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.673970 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6836012d-7ba1-41cd-a9e7-bac6c71b7852-data-volume\") pod \"insights-runtime-extractor-z8s9x\" (UID: \"6836012d-7ba1-41cd-a9e7-bac6c71b7852\") " pod="openshift-insights/insights-runtime-extractor-z8s9x" Apr 24 14:26:47.674195 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.673989 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c497767-0280-48b6-a885-0915f5cc5c12-registry-tls\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.674283 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.674237 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6836012d-7ba1-41cd-a9e7-bac6c71b7852-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-z8s9x\" (UID: \"6836012d-7ba1-41cd-a9e7-bac6c71b7852\") " pod="openshift-insights/insights-runtime-extractor-z8s9x" Apr 24 14:26:47.674283 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.674262 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6836012d-7ba1-41cd-a9e7-bac6c71b7852-data-volume\") pod \"insights-runtime-extractor-z8s9x\" (UID: \"6836012d-7ba1-41cd-a9e7-bac6c71b7852\") " pod="openshift-insights/insights-runtime-extractor-z8s9x" Apr 24 14:26:47.676061 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.676033 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6836012d-7ba1-41cd-a9e7-bac6c71b7852-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-z8s9x\" (UID: \"6836012d-7ba1-41cd-a9e7-bac6c71b7852\") " pod="openshift-insights/insights-runtime-extractor-z8s9x" Apr 24 14:26:47.684882 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.684858 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44p2n\" (UniqueName: \"kubernetes.io/projected/6836012d-7ba1-41cd-a9e7-bac6c71b7852-kube-api-access-44p2n\") pod \"insights-runtime-extractor-z8s9x\" (UID: \"6836012d-7ba1-41cd-a9e7-bac6c71b7852\") " pod="openshift-insights/insights-runtime-extractor-z8s9x" Apr 24 14:26:47.774719 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.774687 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsjtl\" (UniqueName: \"kubernetes.io/projected/75f88656-05dd-4670-8fee-421667219118-kube-api-access-qsjtl\") pod \"downloads-6bcc868b7-tpm5c\" (UID: \"75f88656-05dd-4670-8fee-421667219118\") " 
pod="openshift-console/downloads-6bcc868b7-tpm5c" Apr 24 14:26:47.774837 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.774735 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3c497767-0280-48b6-a885-0915f5cc5c12-registry-certificates\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.774837 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.774752 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c497767-0280-48b6-a885-0915f5cc5c12-bound-sa-token\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.774837 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.774782 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c497767-0280-48b6-a885-0915f5cc5c12-registry-tls\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.774837 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.774798 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3c497767-0280-48b6-a885-0915f5cc5c12-ca-trust-extracted\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.774837 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.774825 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/3c497767-0280-48b6-a885-0915f5cc5c12-installation-pull-secrets\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.775051 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.774858 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64nhm\" (UniqueName: \"kubernetes.io/projected/3c497767-0280-48b6-a885-0915f5cc5c12-kube-api-access-64nhm\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.775051 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.774884 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3c497767-0280-48b6-a885-0915f5cc5c12-image-registry-private-configuration\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.775051 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.774930 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c497767-0280-48b6-a885-0915f5cc5c12-trusted-ca\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.775331 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.775311 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3c497767-0280-48b6-a885-0915f5cc5c12-ca-trust-extracted\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " 
pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.775648 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.775620 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3c497767-0280-48b6-a885-0915f5cc5c12-registry-certificates\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.775850 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.775826 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c497767-0280-48b6-a885-0915f5cc5c12-trusted-ca\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.777157 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.777133 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3c497767-0280-48b6-a885-0915f5cc5c12-image-registry-private-configuration\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.777256 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.777205 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c497767-0280-48b6-a885-0915f5cc5c12-registry-tls\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.777313 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.777292 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/3c497767-0280-48b6-a885-0915f5cc5c12-installation-pull-secrets\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.783341 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.783311 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsjtl\" (UniqueName: \"kubernetes.io/projected/75f88656-05dd-4670-8fee-421667219118-kube-api-access-qsjtl\") pod \"downloads-6bcc868b7-tpm5c\" (UID: \"75f88656-05dd-4670-8fee-421667219118\") " pod="openshift-console/downloads-6bcc868b7-tpm5c" Apr 24 14:26:47.783983 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.783958 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64nhm\" (UniqueName: \"kubernetes.io/projected/3c497767-0280-48b6-a885-0915f5cc5c12-kube-api-access-64nhm\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.784073 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.784035 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c497767-0280-48b6-a885-0915f5cc5c12-bound-sa-token\") pod \"image-registry-5d475cf49c-4txv4\" (UID: \"3c497767-0280-48b6-a885-0915f5cc5c12\") " pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.789134 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.789112 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-z8s9x" Apr 24 14:26:47.842068 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.842020 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:47.864395 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.864354 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-tpm5c" Apr 24 14:26:47.907769 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.907697 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-z8s9x"] Apr 24 14:26:47.911510 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:26:47.911480 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6836012d_7ba1_41cd_a9e7_bac6c71b7852.slice/crio-42fea42382282e9d54b5a35d45373c4688465cba3b10f1a6286fa2db2ca21f7f WatchSource:0}: Error finding container 42fea42382282e9d54b5a35d45373c4688465cba3b10f1a6286fa2db2ca21f7f: Status 404 returned error can't find the container with id 42fea42382282e9d54b5a35d45373c4688465cba3b10f1a6286fa2db2ca21f7f Apr 24 14:26:47.975752 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.975709 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d475cf49c-4txv4"] Apr 24 14:26:47.978018 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:26:47.977990 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c497767_0280_48b6_a885_0915f5cc5c12.slice/crio-acbaeea86d0d81170e6cfca80245ef44c7e485a0e91d93b1b47f2d05415483b0 WatchSource:0}: Error finding container acbaeea86d0d81170e6cfca80245ef44c7e485a0e91d93b1b47f2d05415483b0: Status 404 returned error can't find the container with id acbaeea86d0d81170e6cfca80245ef44c7e485a0e91d93b1b47f2d05415483b0 Apr 24 14:26:47.987655 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:47.987630 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-tpm5c"] 
Apr 24 14:26:47.991518 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:26:47.991492 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75f88656_05dd_4670_8fee_421667219118.slice/crio-5a8ec9cb463252b0c3aae23bb1f66c7c4f3a70dd3cf8e6a30a68ba7909ceadd8 WatchSource:0}: Error finding container 5a8ec9cb463252b0c3aae23bb1f66c7c4f3a70dd3cf8e6a30a68ba7909ceadd8: Status 404 returned error can't find the container with id 5a8ec9cb463252b0c3aae23bb1f66c7c4f3a70dd3cf8e6a30a68ba7909ceadd8 Apr 24 14:26:48.803312 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:48.803280 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" event={"ID":"3c497767-0280-48b6-a885-0915f5cc5c12","Type":"ContainerStarted","Data":"8768419968f92b563a5ae44926d47949c4ac2b0c3fa4d51a7444d17c2bba37c2"} Apr 24 14:26:48.803694 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:48.803323 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" event={"ID":"3c497767-0280-48b6-a885-0915f5cc5c12","Type":"ContainerStarted","Data":"acbaeea86d0d81170e6cfca80245ef44c7e485a0e91d93b1b47f2d05415483b0"} Apr 24 14:26:48.803694 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:48.803384 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:26:48.809304 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:48.809276 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-z8s9x" event={"ID":"6836012d-7ba1-41cd-a9e7-bac6c71b7852","Type":"ContainerStarted","Data":"bfcb9a383282ce2b28554d2f7d977c1d106dfd5078ff6b2bcc35ace2de0c1896"} Apr 24 14:26:48.809426 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:48.809310 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-z8s9x" event={"ID":"6836012d-7ba1-41cd-a9e7-bac6c71b7852","Type":"ContainerStarted","Data":"1ecd83810d9f7aeb1d1310967844394b2749507ec5f41c9638db3d525ade80b0"}
Apr 24 14:26:48.809426 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:48.809323 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-z8s9x" event={"ID":"6836012d-7ba1-41cd-a9e7-bac6c71b7852","Type":"ContainerStarted","Data":"42fea42382282e9d54b5a35d45373c4688465cba3b10f1a6286fa2db2ca21f7f"}
Apr 24 14:26:48.810290 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:48.810261 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-tpm5c" event={"ID":"75f88656-05dd-4670-8fee-421667219118","Type":"ContainerStarted","Data":"5a8ec9cb463252b0c3aae23bb1f66c7c4f3a70dd3cf8e6a30a68ba7909ceadd8"}
Apr 24 14:26:48.823844 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:48.823803 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" podStartSLOduration=1.823790377 podStartE2EDuration="1.823790377s" podCreationTimestamp="2026-04-24 14:26:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:26:48.82246754 +0000 UTC m=+155.153745305" watchObservedRunningTime="2026-04-24 14:26:48.823790377 +0000 UTC m=+155.155068140"
Apr 24 14:26:49.100262 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:49.100217 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-z29nv" podUID="92dbdc96-9b06-45f2-9e4e-317abc345922"
Apr 24 14:26:49.114387 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:49.114350 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-5gpks" podUID="aa089429-959e-4b12-bddb-1d6d0ce963c9"
Apr 24 14:26:49.241966 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:26:49.241925 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-dkhdd" podUID="7ed1658e-98f8-4fe9-bb01-60b235015d4b"
Apr 24 14:26:49.814056 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:49.813967 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5gpks"
Apr 24 14:26:49.814056 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:49.813986 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-z29nv"
Apr 24 14:26:50.818504 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:50.818460 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-z8s9x" event={"ID":"6836012d-7ba1-41cd-a9e7-bac6c71b7852","Type":"ContainerStarted","Data":"33982aee4ed28152a54cf3dc630b34cbf7e489d2259ec04651b6222d1b7fc54b"}
Apr 24 14:26:50.835183 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:50.835137 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-z8s9x" podStartSLOduration=1.5855457720000001 podStartE2EDuration="3.835122883s" podCreationTimestamp="2026-04-24 14:26:47 +0000 UTC" firstStartedPulling="2026-04-24 14:26:47.979579745 +0000 UTC m=+154.310857490" lastFinishedPulling="2026-04-24 14:26:50.229156854 +0000 UTC m=+156.560434601" observedRunningTime="2026-04-24 14:26:50.834460682 +0000 UTC m=+157.165738445" watchObservedRunningTime="2026-04-24 14:26:50.835122883 +0000 UTC m=+157.166400643"
Apr 24 14:26:52.390035 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:52.390001 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mtp5z"]
Apr 24 14:26:52.393147 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:52.393128 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mtp5z"
Apr 24 14:26:52.395374 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:52.395352 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 24 14:26:52.395502 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:52.395372 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-m86bw\""
Apr 24 14:26:52.400238 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:52.400215 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mtp5z"]
Apr 24 14:26:52.517111 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:52.517072 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f598951f-f7af-4599-aef7-cf715800fb86-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-mtp5z\" (UID: \"f598951f-f7af-4599-aef7-cf715800fb86\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mtp5z"
Apr 24 14:26:52.618179 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:52.618145 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f598951f-f7af-4599-aef7-cf715800fb86-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-mtp5z\" (UID: \"f598951f-f7af-4599-aef7-cf715800fb86\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mtp5z"
Apr 24 14:26:52.620804 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:52.620777 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f598951f-f7af-4599-aef7-cf715800fb86-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-mtp5z\" (UID: \"f598951f-f7af-4599-aef7-cf715800fb86\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mtp5z"
Apr 24 14:26:52.703716 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:52.703633 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mtp5z"
Apr 24 14:26:52.841012 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:52.840983 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mtp5z"]
Apr 24 14:26:52.844203 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:26:52.844173 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf598951f_f7af_4599_aef7_cf715800fb86.slice/crio-b2775535f1b1839bf41f69d1c2876b0197e78ac9c6f28c95acd1b9d166d55ecc WatchSource:0}: Error finding container b2775535f1b1839bf41f69d1c2876b0197e78ac9c6f28c95acd1b9d166d55ecc: Status 404 returned error can't find the container with id b2775535f1b1839bf41f69d1c2876b0197e78ac9c6f28c95acd1b9d166d55ecc
Apr 24 14:26:53.831255 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:53.831215 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mtp5z" event={"ID":"f598951f-f7af-4599-aef7-cf715800fb86","Type":"ContainerStarted","Data":"b2775535f1b1839bf41f69d1c2876b0197e78ac9c6f28c95acd1b9d166d55ecc"}
Apr 24 14:26:54.032349 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:54.032126 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls\") pod \"dns-default-z29nv\" (UID: \"92dbdc96-9b06-45f2-9e4e-317abc345922\") " pod="openshift-dns/dns-default-z29nv"
Apr 24 14:26:54.032349 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:54.032217 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert\") pod \"ingress-canary-5gpks\" (UID: \"aa089429-959e-4b12-bddb-1d6d0ce963c9\") " pod="openshift-ingress-canary/ingress-canary-5gpks"
Apr 24 14:26:54.034497 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:54.034473 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92dbdc96-9b06-45f2-9e4e-317abc345922-metrics-tls\") pod \"dns-default-z29nv\" (UID: \"92dbdc96-9b06-45f2-9e4e-317abc345922\") " pod="openshift-dns/dns-default-z29nv"
Apr 24 14:26:54.034799 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:54.034778 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa089429-959e-4b12-bddb-1d6d0ce963c9-cert\") pod \"ingress-canary-5gpks\" (UID: \"aa089429-959e-4b12-bddb-1d6d0ce963c9\") " pod="openshift-ingress-canary/ingress-canary-5gpks"
Apr 24 14:26:54.316835 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:54.316801 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9v25p\""
Apr 24 14:26:54.317026 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:54.316904 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6p8rb\""
Apr 24 14:26:54.325502 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:54.325465 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-z29nv"
Apr 24 14:26:54.325502 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:54.325483 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5gpks"
Apr 24 14:26:54.461157 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:54.461125 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5gpks"]
Apr 24 14:26:54.466044 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:26:54.466013 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa089429_959e_4b12_bddb_1d6d0ce963c9.slice/crio-480fa117c895f5ec53195bd0e390a14822a767d7015e994ac7586c9077c92bf0 WatchSource:0}: Error finding container 480fa117c895f5ec53195bd0e390a14822a767d7015e994ac7586c9077c92bf0: Status 404 returned error can't find the container with id 480fa117c895f5ec53195bd0e390a14822a767d7015e994ac7586c9077c92bf0
Apr 24 14:26:54.481322 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:54.481293 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z29nv"]
Apr 24 14:26:54.484298 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:26:54.484272 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92dbdc96_9b06_45f2_9e4e_317abc345922.slice/crio-dd2f80513508a3e375a1753e6b4e322990b91837eaf97a776c667147bc954ca1 WatchSource:0}: Error finding container dd2f80513508a3e375a1753e6b4e322990b91837eaf97a776c667147bc954ca1: Status 404 returned error can't find the container with id dd2f80513508a3e375a1753e6b4e322990b91837eaf97a776c667147bc954ca1
Apr 24 14:26:54.835758 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:54.835715 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z29nv" event={"ID":"92dbdc96-9b06-45f2-9e4e-317abc345922","Type":"ContainerStarted","Data":"dd2f80513508a3e375a1753e6b4e322990b91837eaf97a776c667147bc954ca1"}
Apr 24 14:26:54.837163 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:54.837133 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mtp5z" event={"ID":"f598951f-f7af-4599-aef7-cf715800fb86","Type":"ContainerStarted","Data":"2e62a88010dea323baa6eb6c0e9c410ebdc65e6e38ae72e45b8a199d47319670"}
Apr 24 14:26:54.837440 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:54.837422 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mtp5z"
Apr 24 14:26:54.838520 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:54.838495 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5gpks" event={"ID":"aa089429-959e-4b12-bddb-1d6d0ce963c9","Type":"ContainerStarted","Data":"480fa117c895f5ec53195bd0e390a14822a767d7015e994ac7586c9077c92bf0"}
Apr 24 14:26:54.842928 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:54.842908 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mtp5z"
Apr 24 14:26:54.852170 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:54.852121 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mtp5z" podStartSLOduration=1.774744118 podStartE2EDuration="2.852085895s" podCreationTimestamp="2026-04-24 14:26:52 +0000 UTC" firstStartedPulling="2026-04-24 14:26:52.846455204 +0000 UTC m=+159.177732947" lastFinishedPulling="2026-04-24 14:26:53.923796976 +0000 UTC m=+160.255074724" observedRunningTime="2026-04-24 14:26:54.85108991 +0000 UTC m=+161.182367675" watchObservedRunningTime="2026-04-24 14:26:54.852085895 +0000 UTC m=+161.183363662"
Apr 24 14:26:55.449310 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.448126 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-fgjh4"]
Apr 24 14:26:55.451752 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.451724 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-fgjh4"
Apr 24 14:26:55.455376 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.455347 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 14:26:55.455491 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.455425 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 24 14:26:55.455629 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.455614 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 24 14:26:55.455669 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.455642 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-d2xtm\""
Apr 24 14:26:55.455775 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.455757 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 14:26:55.455828 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.455793 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 14:26:55.458295 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.458267 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-fgjh4"]
Apr 24 14:26:55.545835 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.545602 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/22aa473c-cf15-4f73-b224-47edf2e4211a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-fgjh4\" (UID: \"22aa473c-cf15-4f73-b224-47edf2e4211a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fgjh4"
Apr 24 14:26:55.545835 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.545652 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlcvc\" (UniqueName: \"kubernetes.io/projected/22aa473c-cf15-4f73-b224-47edf2e4211a-kube-api-access-wlcvc\") pod \"prometheus-operator-5676c8c784-fgjh4\" (UID: \"22aa473c-cf15-4f73-b224-47edf2e4211a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fgjh4"
Apr 24 14:26:55.545835 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.545738 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/22aa473c-cf15-4f73-b224-47edf2e4211a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-fgjh4\" (UID: \"22aa473c-cf15-4f73-b224-47edf2e4211a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fgjh4"
Apr 24 14:26:55.545835 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.545774 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22aa473c-cf15-4f73-b224-47edf2e4211a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-fgjh4\" (UID: \"22aa473c-cf15-4f73-b224-47edf2e4211a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fgjh4"
Apr 24 14:26:55.646236 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.646197 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/22aa473c-cf15-4f73-b224-47edf2e4211a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-fgjh4\" (UID: \"22aa473c-cf15-4f73-b224-47edf2e4211a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fgjh4"
Apr 24 14:26:55.646398 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.646252 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlcvc\" (UniqueName: \"kubernetes.io/projected/22aa473c-cf15-4f73-b224-47edf2e4211a-kube-api-access-wlcvc\") pod \"prometheus-operator-5676c8c784-fgjh4\" (UID: \"22aa473c-cf15-4f73-b224-47edf2e4211a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fgjh4"
Apr 24 14:26:55.646398 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.646314 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/22aa473c-cf15-4f73-b224-47edf2e4211a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-fgjh4\" (UID: \"22aa473c-cf15-4f73-b224-47edf2e4211a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fgjh4"
Apr 24 14:26:55.646398 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.646350 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22aa473c-cf15-4f73-b224-47edf2e4211a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-fgjh4\" (UID: \"22aa473c-cf15-4f73-b224-47edf2e4211a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fgjh4"
Apr 24 14:26:55.647141 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.647113 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22aa473c-cf15-4f73-b224-47edf2e4211a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-fgjh4\" (UID: \"22aa473c-cf15-4f73-b224-47edf2e4211a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fgjh4"
Apr 24 14:26:55.649960 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.649913 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/22aa473c-cf15-4f73-b224-47edf2e4211a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-fgjh4\" (UID: \"22aa473c-cf15-4f73-b224-47edf2e4211a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fgjh4"
Apr 24 14:26:55.651053 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.651022 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/22aa473c-cf15-4f73-b224-47edf2e4211a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-fgjh4\" (UID: \"22aa473c-cf15-4f73-b224-47edf2e4211a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fgjh4"
Apr 24 14:26:55.654815 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.654792 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlcvc\" (UniqueName: \"kubernetes.io/projected/22aa473c-cf15-4f73-b224-47edf2e4211a-kube-api-access-wlcvc\") pod \"prometheus-operator-5676c8c784-fgjh4\" (UID: \"22aa473c-cf15-4f73-b224-47edf2e4211a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fgjh4"
Apr 24 14:26:55.765392 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:55.765352 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-fgjh4"
Apr 24 14:26:56.661614 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:56.661592 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-fgjh4"]
Apr 24 14:26:56.667226 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:26:56.666343 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22aa473c_cf15_4f73_b224_47edf2e4211a.slice/crio-a83750a6c2f2d21e048706cb890f7dfd974eef5be7e5fbc1eaf0782f037923b8 WatchSource:0}: Error finding container a83750a6c2f2d21e048706cb890f7dfd974eef5be7e5fbc1eaf0782f037923b8: Status 404 returned error can't find the container with id a83750a6c2f2d21e048706cb890f7dfd974eef5be7e5fbc1eaf0782f037923b8
Apr 24 14:26:56.845897 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:56.845863 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5gpks" event={"ID":"aa089429-959e-4b12-bddb-1d6d0ce963c9","Type":"ContainerStarted","Data":"65556e5afaac811b9f349e9a94db17712c1653bd55e4deadf47939e9213a6aed"}
Apr 24 14:26:56.847339 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:56.847315 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-fgjh4" event={"ID":"22aa473c-cf15-4f73-b224-47edf2e4211a","Type":"ContainerStarted","Data":"a83750a6c2f2d21e048706cb890f7dfd974eef5be7e5fbc1eaf0782f037923b8"}
Apr 24 14:26:56.848935 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:56.848914 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z29nv" event={"ID":"92dbdc96-9b06-45f2-9e4e-317abc345922","Type":"ContainerStarted","Data":"3f15674ce7b66266a57b2b9a986d355d8ba86dd034f63fb227fa83b553c1e960"}
Apr 24 14:26:56.848997 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:56.848945 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z29nv" event={"ID":"92dbdc96-9b06-45f2-9e4e-317abc345922","Type":"ContainerStarted","Data":"2222e60894f2d2eb28d136102e3b98c90a15c1740e077ba1ae98fb7e2054f4e4"}
Apr 24 14:26:56.860596 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:56.860544 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5gpks" podStartSLOduration=128.803488309 podStartE2EDuration="2m10.86052952s" podCreationTimestamp="2026-04-24 14:24:46 +0000 UTC" firstStartedPulling="2026-04-24 14:26:54.467823105 +0000 UTC m=+160.799100851" lastFinishedPulling="2026-04-24 14:26:56.524864319 +0000 UTC m=+162.856142062" observedRunningTime="2026-04-24 14:26:56.859577089 +0000 UTC m=+163.190854865" watchObservedRunningTime="2026-04-24 14:26:56.86052952 +0000 UTC m=+163.191807306"
Apr 24 14:26:56.875403 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:56.875358 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-z29nv" podStartSLOduration=128.841325503 podStartE2EDuration="2m10.875341347s" podCreationTimestamp="2026-04-24 14:24:46 +0000 UTC" firstStartedPulling="2026-04-24 14:26:54.486535644 +0000 UTC m=+160.817813392" lastFinishedPulling="2026-04-24 14:26:56.520551489 +0000 UTC m=+162.851829236" observedRunningTime="2026-04-24 14:26:56.874691233 +0000 UTC m=+163.205968998" watchObservedRunningTime="2026-04-24 14:26:56.875341347 +0000 UTC m=+163.206619114"
Apr 24 14:26:57.473276 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:57.473246 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-77664b688c-txf2p"
Apr 24 14:26:57.851605 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:26:57.851575 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-z29nv"
Apr 24 14:27:00.367741 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.367710 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-77d59944c9-wxflm"]
Apr 24 14:27:00.370968 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.370946 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.373890 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.373803 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 14:27:00.373890 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.373837 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 14:27:00.373890 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.373850 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-z8khd\""
Apr 24 14:27:00.373890 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.373865 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 14:27:00.374182 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.373850 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 14:27:00.374182 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.374176 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 14:27:00.377946 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.377924 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77d59944c9-wxflm"]
Apr 24 14:27:00.378932 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.378909 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 24 14:27:00.384082 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.384062 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-console-config\") pod \"console-77d59944c9-wxflm\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.384194 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.384089 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-trusted-ca-bundle\") pod \"console-77d59944c9-wxflm\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.384255 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.384191 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkzx5\" (UniqueName: \"kubernetes.io/projected/01608b64-ea1f-41a6-b298-ac6f70f4c55e-kube-api-access-xkzx5\") pod \"console-77d59944c9-wxflm\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.384255 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.384235 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01608b64-ea1f-41a6-b298-ac6f70f4c55e-console-serving-cert\") pod \"console-77d59944c9-wxflm\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.384353 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.384272 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-oauth-serving-cert\") pod \"console-77d59944c9-wxflm\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.384353 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.384341 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01608b64-ea1f-41a6-b298-ac6f70f4c55e-console-oauth-config\") pod \"console-77d59944c9-wxflm\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.384443 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.384364 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-service-ca\") pod \"console-77d59944c9-wxflm\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.485498 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.485459 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01608b64-ea1f-41a6-b298-ac6f70f4c55e-console-oauth-config\") pod \"console-77d59944c9-wxflm\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.485498 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.485504 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-service-ca\") pod \"console-77d59944c9-wxflm\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.485732 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.485546 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-console-config\") pod \"console-77d59944c9-wxflm\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.485732 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.485568 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-trusted-ca-bundle\") pod \"console-77d59944c9-wxflm\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.485732 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.485631 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkzx5\" (UniqueName: \"kubernetes.io/projected/01608b64-ea1f-41a6-b298-ac6f70f4c55e-kube-api-access-xkzx5\") pod \"console-77d59944c9-wxflm\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.485732 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.485665 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01608b64-ea1f-41a6-b298-ac6f70f4c55e-console-serving-cert\") pod \"console-77d59944c9-wxflm\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.485732 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.485699 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-oauth-serving-cert\") pod \"console-77d59944c9-wxflm\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.486525 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.486457 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-oauth-serving-cert\") pod \"console-77d59944c9-wxflm\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.487035 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.487006 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-console-config\") pod \"console-77d59944c9-wxflm\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.487188 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.487153 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-trusted-ca-bundle\") pod \"console-77d59944c9-wxflm\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.487252 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.487221 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-service-ca\") pod \"console-77d59944c9-wxflm\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.488804 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.488779 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01608b64-ea1f-41a6-b298-ac6f70f4c55e-console-oauth-config\") pod \"console-77d59944c9-wxflm\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.488899 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.488810 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01608b64-ea1f-41a6-b298-ac6f70f4c55e-console-serving-cert\") pod \"console-77d59944c9-wxflm\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.493307 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.493287 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkzx5\" (UniqueName: \"kubernetes.io/projected/01608b64-ea1f-41a6-b298-ac6f70f4c55e-kube-api-access-xkzx5\") pod \"console-77d59944c9-wxflm\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:00.682844 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:00.682754 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:02.229617 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:02.229584 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkhdd"
Apr 24 14:27:03.431221 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:03.431194 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77d59944c9-wxflm"]
Apr 24 14:27:03.438210 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:27:03.438184 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01608b64_ea1f_41a6_b298_ac6f70f4c55e.slice/crio-de4a68e352a9ce2e846cc8f8cea7073c2337862aef5878e0413e128643e225e5 WatchSource:0}: Error finding container de4a68e352a9ce2e846cc8f8cea7073c2337862aef5878e0413e128643e225e5: Status 404 returned error can't find the container with id de4a68e352a9ce2e846cc8f8cea7073c2337862aef5878e0413e128643e225e5
Apr 24 14:27:03.870840 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:03.870732 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77d59944c9-wxflm" event={"ID":"01608b64-ea1f-41a6-b298-ac6f70f4c55e","Type":"ContainerStarted","Data":"de4a68e352a9ce2e846cc8f8cea7073c2337862aef5878e0413e128643e225e5"}
Apr 24 14:27:03.873085 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:03.872981 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-tpm5c" event={"ID":"75f88656-05dd-4670-8fee-421667219118","Type":"ContainerStarted","Data":"288c9a8d48ac1ea18e2870b9f2f9a02d549e82cd5ecf0ae02b27de8028db718d"}
Apr 24 14:27:03.874224 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:03.874008 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-tpm5c"
Apr 24 14:27:03.884642 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:03.884598 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-tpm5c"
Apr 24 14:27:03.891218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:03.891120 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-tpm5c" podStartSLOduration=1.480385601 podStartE2EDuration="16.891087586s" podCreationTimestamp="2026-04-24 14:26:47 +0000 UTC" firstStartedPulling="2026-04-24 14:26:47.993369659 +0000 UTC m=+154.324647402" lastFinishedPulling="2026-04-24 14:27:03.404071641 +0000 UTC m=+169.735349387" observedRunningTime="2026-04-24 14:27:03.890084491 +0000 UTC m=+170.221362255" watchObservedRunningTime="2026-04-24 14:27:03.891087586 +0000 UTC m=+170.222365375"
Apr 24 14:27:04.882537 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:04.882497 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-fgjh4" event={"ID":"22aa473c-cf15-4f73-b224-47edf2e4211a","Type":"ContainerStarted","Data":"32ea422ce8502b4cde18da90108a5f200affb9bc49c74870e157fc450d30089f"}
Apr 24 14:27:04.882982 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:04.882545 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-fgjh4" event={"ID":"22aa473c-cf15-4f73-b224-47edf2e4211a","Type":"ContainerStarted","Data":"0734bce5a29f16dc6061effb5fad1287b75849d5778be43be3278d5606c15c63"}
Apr 24 14:27:04.900902 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:04.900840 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-fgjh4" podStartSLOduration=2.102934958 podStartE2EDuration="9.900820791s" podCreationTimestamp="2026-04-24 14:26:55 +0000 UTC" firstStartedPulling="2026-04-24 14:26:56.667763684 +0000 UTC m=+162.999041438" lastFinishedPulling="2026-04-24 14:27:04.465649526 +0000 UTC m=+170.796927271" observedRunningTime="2026-04-24 14:27:04.90046365 +0000 UTC m=+171.231741441" watchObservedRunningTime="2026-04-24 14:27:04.900820791 +0000 UTC m=+171.232098555"
Apr 24 14:27:06.825924 ip-10-0-138-116 kubenswrapper[2571]: I0424
14:27:06.825887 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zvj56"] Apr 24 14:27:06.848407 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:06.848382 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:06.851459 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:06.851261 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-lpfj8\"" Apr 24 14:27:06.851459 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:06.851348 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 14:27:06.851459 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:06.851360 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 14:27:06.851459 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:06.851414 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 14:27:06.945983 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:06.945930 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cd857084-4b07-4387-b3f1-e5478ef38bdc-root\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:06.945983 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:06.945979 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cd857084-4b07-4387-b3f1-e5478ef38bdc-node-exporter-textfile\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") 
" pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:06.946258 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:06.946055 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cd857084-4b07-4387-b3f1-e5478ef38bdc-node-exporter-accelerators-collector-config\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:06.946258 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:06.946163 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cd857084-4b07-4387-b3f1-e5478ef38bdc-node-exporter-wtmp\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:06.946258 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:06.946193 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76jml\" (UniqueName: \"kubernetes.io/projected/cd857084-4b07-4387-b3f1-e5478ef38bdc-kube-api-access-76jml\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:06.946258 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:06.946225 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cd857084-4b07-4387-b3f1-e5478ef38bdc-sys\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:06.946407 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:06.946338 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cd857084-4b07-4387-b3f1-e5478ef38bdc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:06.946407 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:06.946375 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd857084-4b07-4387-b3f1-e5478ef38bdc-metrics-client-ca\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:06.946492 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:06.946424 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cd857084-4b07-4387-b3f1-e5478ef38bdc-node-exporter-tls\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:07.047176 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:07.047139 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cd857084-4b07-4387-b3f1-e5478ef38bdc-node-exporter-wtmp\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:07.047176 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:07.047181 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76jml\" (UniqueName: \"kubernetes.io/projected/cd857084-4b07-4387-b3f1-e5478ef38bdc-kube-api-access-76jml\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:07.047387 ip-10-0-138-116 
kubenswrapper[2571]: I0424 14:27:07.047317 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cd857084-4b07-4387-b3f1-e5478ef38bdc-sys\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:07.047387 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:07.047337 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cd857084-4b07-4387-b3f1-e5478ef38bdc-node-exporter-wtmp\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:07.047496 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:07.047398 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cd857084-4b07-4387-b3f1-e5478ef38bdc-sys\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:07.047496 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:07.047471 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cd857084-4b07-4387-b3f1-e5478ef38bdc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:07.047572 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:07.047496 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd857084-4b07-4387-b3f1-e5478ef38bdc-metrics-client-ca\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:07.047572 ip-10-0-138-116 
kubenswrapper[2571]: I0424 14:27:07.047537 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cd857084-4b07-4387-b3f1-e5478ef38bdc-node-exporter-tls\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:07.047659 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:07.047572 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cd857084-4b07-4387-b3f1-e5478ef38bdc-root\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:07.047659 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:07.047598 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cd857084-4b07-4387-b3f1-e5478ef38bdc-node-exporter-textfile\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:07.047659 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:07.047624 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cd857084-4b07-4387-b3f1-e5478ef38bdc-node-exporter-accelerators-collector-config\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:07.047781 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:07.047680 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cd857084-4b07-4387-b3f1-e5478ef38bdc-root\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 
24 14:27:07.048174 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:07.048129 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cd857084-4b07-4387-b3f1-e5478ef38bdc-node-exporter-textfile\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:07.048421 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:07.048399 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd857084-4b07-4387-b3f1-e5478ef38bdc-metrics-client-ca\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:07.048468 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:07.048449 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cd857084-4b07-4387-b3f1-e5478ef38bdc-node-exporter-accelerators-collector-config\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:07.049969 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:07.049938 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cd857084-4b07-4387-b3f1-e5478ef38bdc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:07.050196 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:07.050174 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cd857084-4b07-4387-b3f1-e5478ef38bdc-node-exporter-tls\") pod \"node-exporter-zvj56\" (UID: 
\"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:07.055027 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:07.055006 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76jml\" (UniqueName: \"kubernetes.io/projected/cd857084-4b07-4387-b3f1-e5478ef38bdc-kube-api-access-76jml\") pod \"node-exporter-zvj56\" (UID: \"cd857084-4b07-4387-b3f1-e5478ef38bdc\") " pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:07.159256 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:07.159231 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zvj56" Apr 24 14:27:07.170485 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:27:07.170454 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd857084_4b07_4387_b3f1_e5478ef38bdc.slice/crio-24dc227580edd06d6a3a8028af7c9fde23545b09054c8b85beb16ef43e7f8718 WatchSource:0}: Error finding container 24dc227580edd06d6a3a8028af7c9fde23545b09054c8b85beb16ef43e7f8718: Status 404 returned error can't find the container with id 24dc227580edd06d6a3a8028af7c9fde23545b09054c8b85beb16ef43e7f8718 Apr 24 14:27:07.856061 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:07.855930 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-z29nv" Apr 24 14:27:07.895827 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:07.895555 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77d59944c9-wxflm" event={"ID":"01608b64-ea1f-41a6-b298-ac6f70f4c55e","Type":"ContainerStarted","Data":"6e0bde41bde46c2eea5c27652f302059c5de3bc0717e6b9b0239aba07e028a47"} Apr 24 14:27:07.897363 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:07.897302 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zvj56" 
event={"ID":"cd857084-4b07-4387-b3f1-e5478ef38bdc","Type":"ContainerStarted","Data":"24dc227580edd06d6a3a8028af7c9fde23545b09054c8b85beb16ef43e7f8718"} Apr 24 14:27:07.918873 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:07.918805 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77d59944c9-wxflm" podStartSLOduration=4.241391132 podStartE2EDuration="7.918786247s" podCreationTimestamp="2026-04-24 14:27:00 +0000 UTC" firstStartedPulling="2026-04-24 14:27:03.440013276 +0000 UTC m=+169.771291018" lastFinishedPulling="2026-04-24 14:27:07.117408378 +0000 UTC m=+173.448686133" observedRunningTime="2026-04-24 14:27:07.918426178 +0000 UTC m=+174.249703943" watchObservedRunningTime="2026-04-24 14:27:07.918786247 +0000 UTC m=+174.250064011" Apr 24 14:27:08.902294 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:08.902214 2571 generic.go:358] "Generic (PLEG): container finished" podID="cd857084-4b07-4387-b3f1-e5478ef38bdc" containerID="81421fdcdde4cf8ff1a971e25b894dac42c5c373b0d142400964186cd9bfcc43" exitCode=0 Apr 24 14:27:08.902726 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:08.902305 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zvj56" event={"ID":"cd857084-4b07-4387-b3f1-e5478ef38bdc","Type":"ContainerDied","Data":"81421fdcdde4cf8ff1a971e25b894dac42c5c373b0d142400964186cd9bfcc43"} Apr 24 14:27:09.818651 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.818623 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5d475cf49c-4txv4" Apr 24 14:27:09.831416 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.831385 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6bf7658d46-wp8rp"] Apr 24 14:27:09.851883 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.851856 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/thanos-querier-6bf7658d46-wp8rp"] Apr 24 14:27:09.852079 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.852039 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" Apr 24 14:27:09.854609 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.854580 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 24 14:27:09.854719 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.854613 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-4nnp4\"" Apr 24 14:27:09.854846 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.854828 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 24 14:27:09.855217 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.855181 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-8c23it90ftrfv\"" Apr 24 14:27:09.855736 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.855659 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 24 14:27:09.856086 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.856062 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 24 14:27:09.856371 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.856354 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 24 14:27:09.907367 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.907326 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-zvj56" event={"ID":"cd857084-4b07-4387-b3f1-e5478ef38bdc","Type":"ContainerStarted","Data":"d4ea0f428592591e5eb3b3d7708f4a8e6a00ffa550f0f7ad8a49ac298737daf6"} Apr 24 14:27:09.907367 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.907368 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zvj56" event={"ID":"cd857084-4b07-4387-b3f1-e5478ef38bdc","Type":"ContainerStarted","Data":"fc5983213b2cb552473074f604cf7d04f9e8f12075ca62f47400843e88c0239c"} Apr 24 14:27:09.932423 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.932371 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zvj56" podStartSLOduration=2.5073981549999997 podStartE2EDuration="3.932351571s" podCreationTimestamp="2026-04-24 14:27:06 +0000 UTC" firstStartedPulling="2026-04-24 14:27:07.172577663 +0000 UTC m=+173.503855405" lastFinishedPulling="2026-04-24 14:27:08.597531067 +0000 UTC m=+174.928808821" observedRunningTime="2026-04-24 14:27:09.930455759 +0000 UTC m=+176.261733539" watchObservedRunningTime="2026-04-24 14:27:09.932351571 +0000 UTC m=+176.263629336" Apr 24 14:27:09.976446 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.976406 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dde381ac-d51e-4fed-80be-b149847b7866-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" Apr 24 14:27:09.976635 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.976466 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/dde381ac-d51e-4fed-80be-b149847b7866-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" Apr 24 14:27:09.976635 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.976499 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8crp\" (UniqueName: \"kubernetes.io/projected/dde381ac-d51e-4fed-80be-b149847b7866-kube-api-access-j8crp\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" Apr 24 14:27:09.976635 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.976541 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/dde381ac-d51e-4fed-80be-b149847b7866-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" Apr 24 14:27:09.976797 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.976713 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dde381ac-d51e-4fed-80be-b149847b7866-secret-grpc-tls\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" Apr 24 14:27:09.976797 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.976763 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/dde381ac-d51e-4fed-80be-b149847b7866-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" Apr 24 14:27:09.976894 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.976850 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dde381ac-d51e-4fed-80be-b149847b7866-metrics-client-ca\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" Apr 24 14:27:09.976950 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:09.976899 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/dde381ac-d51e-4fed-80be-b149847b7866-secret-thanos-querier-tls\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" Apr 24 14:27:10.078228 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.078131 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/dde381ac-d51e-4fed-80be-b149847b7866-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" Apr 24 14:27:10.078425 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.078231 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dde381ac-d51e-4fed-80be-b149847b7866-secret-grpc-tls\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: 
\"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" Apr 24 14:27:10.078425 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.078263 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/dde381ac-d51e-4fed-80be-b149847b7866-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" Apr 24 14:27:10.078425 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.078295 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dde381ac-d51e-4fed-80be-b149847b7866-metrics-client-ca\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" Apr 24 14:27:10.078425 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.078320 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/dde381ac-d51e-4fed-80be-b149847b7866-secret-thanos-querier-tls\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" Apr 24 14:27:10.078425 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.078376 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dde381ac-d51e-4fed-80be-b149847b7866-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" Apr 24 14:27:10.078425 ip-10-0-138-116 kubenswrapper[2571]: I0424 
14:27:10.078408 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dde381ac-d51e-4fed-80be-b149847b7866-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp"
Apr 24 14:27:10.078725 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.078436 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8crp\" (UniqueName: \"kubernetes.io/projected/dde381ac-d51e-4fed-80be-b149847b7866-kube-api-access-j8crp\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp"
Apr 24 14:27:10.079400 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.079347 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dde381ac-d51e-4fed-80be-b149847b7866-metrics-client-ca\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp"
Apr 24 14:27:10.081190 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.081163 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dde381ac-d51e-4fed-80be-b149847b7866-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp"
Apr 24 14:27:10.081313 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.081287 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/dde381ac-d51e-4fed-80be-b149847b7866-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp"
Apr 24 14:27:10.081396 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.081370 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/dde381ac-d51e-4fed-80be-b149847b7866-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp"
Apr 24 14:27:10.081607 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.081582 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dde381ac-d51e-4fed-80be-b149847b7866-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp"
Apr 24 14:27:10.081940 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.081912 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dde381ac-d51e-4fed-80be-b149847b7866-secret-grpc-tls\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp"
Apr 24 14:27:10.082151 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.082133 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/dde381ac-d51e-4fed-80be-b149847b7866-secret-thanos-querier-tls\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp"
Apr 24 14:27:10.086306 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.086290 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8crp\" (UniqueName: \"kubernetes.io/projected/dde381ac-d51e-4fed-80be-b149847b7866-kube-api-access-j8crp\") pod \"thanos-querier-6bf7658d46-wp8rp\" (UID: \"dde381ac-d51e-4fed-80be-b149847b7866\") " pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp"
Apr 24 14:27:10.164308 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.164272 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp"
Apr 24 14:27:10.301691 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.301654 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6bf7658d46-wp8rp"]
Apr 24 14:27:10.305293 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:27:10.305253 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddde381ac_d51e_4fed_80be_b149847b7866.slice/crio-b141ef808718e3af3a5165e7e353acf2e06e5dfedf1da9a856af3fd83fc102f2 WatchSource:0}: Error finding container b141ef808718e3af3a5165e7e353acf2e06e5dfedf1da9a856af3fd83fc102f2: Status 404 returned error can't find the container with id b141ef808718e3af3a5165e7e353acf2e06e5dfedf1da9a856af3fd83fc102f2
Apr 24 14:27:10.682994 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.682957 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:10.683256 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.683020 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:10.688066 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.688044 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:10.913485 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.913442 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" event={"ID":"dde381ac-d51e-4fed-80be-b149847b7866","Type":"ContainerStarted","Data":"b141ef808718e3af3a5165e7e353acf2e06e5dfedf1da9a856af3fd83fc102f2"}
Apr 24 14:27:10.918525 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:10.918341 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:12.487437 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:12.487375 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-77664b688c-txf2p" podUID="a41a9f0b-b6ca-49c0-9009-04ff82be6d5d" containerName="registry" containerID="cri-o://7e45a5c7e60ffee9037d2ecca5d98324794558bd95bc9714530fa08fa0044594" gracePeriod=30
Apr 24 14:27:12.888189 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:12.887668 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-77664b688c-txf2p"
Apr 24 14:27:12.920886 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:12.920849 2571 generic.go:358] "Generic (PLEG): container finished" podID="a41a9f0b-b6ca-49c0-9009-04ff82be6d5d" containerID="7e45a5c7e60ffee9037d2ecca5d98324794558bd95bc9714530fa08fa0044594" exitCode=0
Apr 24 14:27:12.920886 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:12.920872 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-77664b688c-txf2p"
Apr 24 14:27:12.921141 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:12.920899 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77664b688c-txf2p" event={"ID":"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d","Type":"ContainerDied","Data":"7e45a5c7e60ffee9037d2ecca5d98324794558bd95bc9714530fa08fa0044594"}
Apr 24 14:27:12.921141 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:12.920946 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77664b688c-txf2p" event={"ID":"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d","Type":"ContainerDied","Data":"5cd34177334915cb0b03a5c900ca93ad06ae071229cf475282a2157892f1190e"}
Apr 24 14:27:12.921141 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:12.920968 2571 scope.go:117] "RemoveContainer" containerID="7e45a5c7e60ffee9037d2ecca5d98324794558bd95bc9714530fa08fa0044594"
Apr 24 14:27:12.922812 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:12.922663 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" event={"ID":"dde381ac-d51e-4fed-80be-b149847b7866","Type":"ContainerStarted","Data":"043facd3418a40afe9d8b995a487ec26344144855a159449867fae28d66b0938"}
Apr 24 14:27:12.943824 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:12.943797 2571 scope.go:117] "RemoveContainer" containerID="7e45a5c7e60ffee9037d2ecca5d98324794558bd95bc9714530fa08fa0044594"
Apr 24 14:27:12.944222 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:27:12.944189 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e45a5c7e60ffee9037d2ecca5d98324794558bd95bc9714530fa08fa0044594\": container with ID starting with 7e45a5c7e60ffee9037d2ecca5d98324794558bd95bc9714530fa08fa0044594 not found: ID does not exist" containerID="7e45a5c7e60ffee9037d2ecca5d98324794558bd95bc9714530fa08fa0044594"
Apr 24 14:27:12.944333 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:12.944231 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e45a5c7e60ffee9037d2ecca5d98324794558bd95bc9714530fa08fa0044594"} err="failed to get container status \"7e45a5c7e60ffee9037d2ecca5d98324794558bd95bc9714530fa08fa0044594\": rpc error: code = NotFound desc = could not find container \"7e45a5c7e60ffee9037d2ecca5d98324794558bd95bc9714530fa08fa0044594\": container with ID starting with 7e45a5c7e60ffee9037d2ecca5d98324794558bd95bc9714530fa08fa0044594 not found: ID does not exist"
Apr 24 14:27:13.007401 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.007370 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-bound-sa-token\") pod \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") "
Apr 24 14:27:13.007510 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.007425 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-ca-trust-extracted\") pod \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") "
Apr 24 14:27:13.007510 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.007458 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-certificates\") pod \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") "
Apr 24 14:27:13.007582 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.007519 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-tls\") pod \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") "
Apr 24 14:27:13.007582 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.007545 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-trusted-ca\") pod \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") "
Apr 24 14:27:13.007638 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.007582 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-image-registry-private-configuration\") pod \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") "
Apr 24 14:27:13.007638 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.007608 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-installation-pull-secrets\") pod \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") "
Apr 24 14:27:13.007707 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.007648 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gwhj\" (UniqueName: \"kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-kube-api-access-9gwhj\") pod \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\" (UID: \"a41a9f0b-b6ca-49c0-9009-04ff82be6d5d\") "
Apr 24 14:27:13.008367 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.008068 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a41a9f0b-b6ca-49c0-9009-04ff82be6d5d" (UID: "a41a9f0b-b6ca-49c0-9009-04ff82be6d5d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:27:13.008367 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.008121 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a41a9f0b-b6ca-49c0-9009-04ff82be6d5d" (UID: "a41a9f0b-b6ca-49c0-9009-04ff82be6d5d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:27:13.010428 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.010401 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a41a9f0b-b6ca-49c0-9009-04ff82be6d5d" (UID: "a41a9f0b-b6ca-49c0-9009-04ff82be6d5d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 14:27:13.010573 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.010497 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-kube-api-access-9gwhj" (OuterVolumeSpecName: "kube-api-access-9gwhj") pod "a41a9f0b-b6ca-49c0-9009-04ff82be6d5d" (UID: "a41a9f0b-b6ca-49c0-9009-04ff82be6d5d"). InnerVolumeSpecName "kube-api-access-9gwhj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 14:27:13.010726 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.010700 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a41a9f0b-b6ca-49c0-9009-04ff82be6d5d" (UID: "a41a9f0b-b6ca-49c0-9009-04ff82be6d5d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 14:27:13.010726 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.010710 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "a41a9f0b-b6ca-49c0-9009-04ff82be6d5d" (UID: "a41a9f0b-b6ca-49c0-9009-04ff82be6d5d"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:27:13.010850 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.010755 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a41a9f0b-b6ca-49c0-9009-04ff82be6d5d" (UID: "a41a9f0b-b6ca-49c0-9009-04ff82be6d5d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:27:13.017939 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.017905 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a41a9f0b-b6ca-49c0-9009-04ff82be6d5d" (UID: "a41a9f0b-b6ca-49c0-9009-04ff82be6d5d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:27:13.071713 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.071682 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 14:27:13.072277 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.072201 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a41a9f0b-b6ca-49c0-9009-04ff82be6d5d" containerName="registry"
Apr 24 14:27:13.072277 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.072220 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a41a9f0b-b6ca-49c0-9009-04ff82be6d5d" containerName="registry"
Apr 24 14:27:13.072390 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.072283 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a41a9f0b-b6ca-49c0-9009-04ff82be6d5d" containerName="registry"
Apr 24 14:27:13.104291 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.103639 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 14:27:13.104291 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.103825 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.106752 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.106590 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-a9rsfnnf54gb7\""
Apr 24 14:27:13.106752 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.106624 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 24 14:27:13.106752 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.106702 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-fkxcl\""
Apr 24 14:27:13.107822 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.107256 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 24 14:27:13.107822 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.107381 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 24 14:27:13.107822 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.107394 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 24 14:27:13.107822 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.107522 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 24 14:27:13.107822 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.107712 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 24 14:27:13.108459 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.108438 2571 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-image-registry-private-configuration\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\""
Apr 24 14:27:13.108552 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.108464 2571 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-installation-pull-secrets\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\""
Apr 24 14:27:13.108552 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.108520 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9gwhj\" (UniqueName: \"kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-kube-api-access-9gwhj\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\""
Apr 24 14:27:13.108552 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.108535 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-bound-sa-token\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\""
Apr 24 14:27:13.108552 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.108550 2571 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-ca-trust-extracted\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\""
Apr 24 14:27:13.108763 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.108563 2571 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-certificates\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\""
Apr 24 14:27:13.108763 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.108579 2571 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-registry-tls\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\""
Apr 24 14:27:13.108763 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.108616 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d-trusted-ca\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\""
Apr 24 14:27:13.109219 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.109034 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 24 14:27:13.110459 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.109984 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 24 14:27:13.110459 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.110171 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 24 14:27:13.110459 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.110243 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 24 14:27:13.110459 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.110359 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 24 14:27:13.115117 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.112741 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 24 14:27:13.115117 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.114999 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 24 14:27:13.209512 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.209477 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2654f68e-fd75-44a5-9479-64d02268d0e2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.209512 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.209522 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2654f68e-fd75-44a5-9479-64d02268d0e2-config-out\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.209738 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.209585 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.209738 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.209639 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2654f68e-fd75-44a5-9479-64d02268d0e2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.209738 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.209660 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2654f68e-fd75-44a5-9479-64d02268d0e2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.209738 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.209682 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.209738 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.209724 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-config\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.210092 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.209756 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.210092 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.209777 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.210092 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.209872 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.210092 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.209907 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-web-config\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.210092 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.209941 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2654f68e-fd75-44a5-9479-64d02268d0e2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.210092 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.210012 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2654f68e-fd75-44a5-9479-64d02268d0e2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.210092 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.210053 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdlmj\" (UniqueName: \"kubernetes.io/projected/2654f68e-fd75-44a5-9479-64d02268d0e2-kube-api-access-mdlmj\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.210092 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.210082 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2654f68e-fd75-44a5-9479-64d02268d0e2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.210556 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.210132 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.210556 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.210159 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2654f68e-fd75-44a5-9479-64d02268d0e2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.210556 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.210211 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.246130 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.246094 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-77664b688c-txf2p"]
Apr 24 14:27:13.251583 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.251555 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-77664b688c-txf2p"]
Apr 24 14:27:13.310636 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.310599 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdlmj\" (UniqueName: \"kubernetes.io/projected/2654f68e-fd75-44a5-9479-64d02268d0e2-kube-api-access-mdlmj\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.310636 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.310640 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2654f68e-fd75-44a5-9479-64d02268d0e2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.310636 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.310657 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.310916 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.310674 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2654f68e-fd75-44a5-9479-64d02268d0e2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.310916 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.310699 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.310916 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.310715 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2654f68e-fd75-44a5-9479-64d02268d0e2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.310916 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.310769 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2654f68e-fd75-44a5-9479-64d02268d0e2-config-out\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.310916 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.310812 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.310916 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.310858 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2654f68e-fd75-44a5-9479-64d02268d0e2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.310916 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.310887 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2654f68e-fd75-44a5-9479-64d02268d0e2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.310916 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.310915 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.311406 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.310938 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-config\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.311406 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.310965 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.312008 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.311609 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2654f68e-fd75-44a5-9479-64d02268d0e2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.312700 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.312675 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.312887 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.312867 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.313005 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.312989 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-web-config\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.313149 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.313133 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2654f68e-fd75-44a5-9479-64d02268d0e2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:13.313317 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.313301 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName:
\"kubernetes.io/empty-dir/2654f68e-fd75-44a5-9479-64d02268d0e2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:13.313715 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.313696 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2654f68e-fd75-44a5-9479-64d02268d0e2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:13.314288 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.314263 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:13.314384 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.314361 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2654f68e-fd75-44a5-9479-64d02268d0e2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:13.314445 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.314393 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2654f68e-fd75-44a5-9479-64d02268d0e2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:13.315066 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.315041 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:13.315428 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.315403 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2654f68e-fd75-44a5-9479-64d02268d0e2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:13.315821 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.315756 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:13.316389 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.316331 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-config\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:13.316948 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.316925 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2654f68e-fd75-44a5-9479-64d02268d0e2-config-out\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:13.317039 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.317014 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:13.317218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.317199 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:13.317292 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.317262 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:13.317581 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.317557 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:13.318278 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.318249 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2654f68e-fd75-44a5-9479-64d02268d0e2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:13.319210 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.319168 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2654f68e-fd75-44a5-9479-64d02268d0e2-web-config\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:13.319749 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.319707 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdlmj\" (UniqueName: \"kubernetes.io/projected/2654f68e-fd75-44a5-9479-64d02268d0e2-kube-api-access-mdlmj\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:13.320755 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.320735 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2654f68e-fd75-44a5-9479-64d02268d0e2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2654f68e-fd75-44a5-9479-64d02268d0e2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:13.419233 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.419173 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:13.573844 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.571483 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 14:27:13.574668 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:27:13.574638 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2654f68e_fd75_44a5_9479_64d02268d0e2.slice/crio-614f74fb3b1b9df2d90ede5c02336a30cf56866c084c40189c70fab53821692a WatchSource:0}: Error finding container 614f74fb3b1b9df2d90ede5c02336a30cf56866c084c40189c70fab53821692a: Status 404 returned error can't find the container with id 614f74fb3b1b9df2d90ede5c02336a30cf56866c084c40189c70fab53821692a Apr 24 14:27:13.926734 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.926656 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2654f68e-fd75-44a5-9479-64d02268d0e2","Type":"ContainerStarted","Data":"614f74fb3b1b9df2d90ede5c02336a30cf56866c084c40189c70fab53821692a"} Apr 24 14:27:13.929142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.929093 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" event={"ID":"dde381ac-d51e-4fed-80be-b149847b7866","Type":"ContainerStarted","Data":"882d4d3db6d3896bbc78e57cd9172e8d097148a72573958190150a24be437fe8"} Apr 24 14:27:13.929142 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:13.929143 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" event={"ID":"dde381ac-d51e-4fed-80be-b149847b7866","Type":"ContainerStarted","Data":"6e01494ed3268ce060ce1f1f5c4afa7a05d2b5fdb65e84d9fdef0a2b2bcfae1a"} Apr 24 14:27:14.229865 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:14.229785 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a41a9f0b-b6ca-49c0-9009-04ff82be6d5d" path="/var/lib/kubelet/pods/a41a9f0b-b6ca-49c0-9009-04ff82be6d5d/volumes" Apr 24 14:27:14.935231 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:14.935205 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" event={"ID":"dde381ac-d51e-4fed-80be-b149847b7866","Type":"ContainerStarted","Data":"5da3c7003b95357ff9552c207dcb746c6f7fec8596c5054ae24802fb200c914a"} Apr 24 14:27:14.935231 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:14.935238 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" event={"ID":"dde381ac-d51e-4fed-80be-b149847b7866","Type":"ContainerStarted","Data":"606ac9a8d5cdb0a159b61137eb2ec95719022ddf76e03ef6b557cbb74a3b599d"} Apr 24 14:27:14.935611 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:14.935250 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" event={"ID":"dde381ac-d51e-4fed-80be-b149847b7866","Type":"ContainerStarted","Data":"e3c977ee4739ef2563ac222f6e362a6d3b605219a66824ecba3e0efb11e27c7f"} Apr 24 14:27:15.939199 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:15.939165 2571 generic.go:358] "Generic (PLEG): container finished" podID="2654f68e-fd75-44a5-9479-64d02268d0e2" containerID="12430d65f695d8fa748d63c449477702add6b235ed19928f880008b685c40b79" exitCode=0 Apr 24 14:27:15.939645 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:15.939246 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2654f68e-fd75-44a5-9479-64d02268d0e2","Type":"ContainerDied","Data":"12430d65f695d8fa748d63c449477702add6b235ed19928f880008b685c40b79"} Apr 24 14:27:15.939645 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:15.939609 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" Apr 24 
14:27:15.983087 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:15.983043 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" podStartSLOduration=3.003715481 podStartE2EDuration="6.983030284s" podCreationTimestamp="2026-04-24 14:27:09 +0000 UTC" firstStartedPulling="2026-04-24 14:27:10.307682682 +0000 UTC m=+176.638960437" lastFinishedPulling="2026-04-24 14:27:14.28699749 +0000 UTC m=+180.618275240" observedRunningTime="2026-04-24 14:27:15.982658406 +0000 UTC m=+182.313936169" watchObservedRunningTime="2026-04-24 14:27:15.983030284 +0000 UTC m=+182.314308048" Apr 24 14:27:19.956927 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:19.956837 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2654f68e-fd75-44a5-9479-64d02268d0e2","Type":"ContainerStarted","Data":"5b37b84a88d6946fd12959d1fc79c123bad48522a76338764fb13ca3dc1b3259"} Apr 24 14:27:19.956927 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:19.956878 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2654f68e-fd75-44a5-9479-64d02268d0e2","Type":"ContainerStarted","Data":"051fa07238199fdc08b541661ab00c69713403d04659e00befbcf611794e4a50"} Apr 24 14:27:19.956927 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:19.956888 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2654f68e-fd75-44a5-9479-64d02268d0e2","Type":"ContainerStarted","Data":"202d49b16d64682c309a34102bd2a0c92bdc6083c152086d437af0413b7b995a"} Apr 24 14:27:19.956927 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:19.956899 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2654f68e-fd75-44a5-9479-64d02268d0e2","Type":"ContainerStarted","Data":"9e35b7d20bb3b152778445b8297313769b595f4e08001cd242842fb3f2d1964f"} Apr 24 
14:27:19.956927 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:19.956908 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2654f68e-fd75-44a5-9479-64d02268d0e2","Type":"ContainerStarted","Data":"f246a248ef6aabebc5d601f1ce29840a016667debf68a6a6b0604ecad8ec348e"} Apr 24 14:27:19.956927 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:19.956917 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2654f68e-fd75-44a5-9479-64d02268d0e2","Type":"ContainerStarted","Data":"ec5b53a0383441c3828ed9f05b398eee64e58e8f95b114a80941bda40578f381"} Apr 24 14:27:19.983983 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:19.983937 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.576613306 podStartE2EDuration="6.983924417s" podCreationTimestamp="2026-04-24 14:27:13 +0000 UTC" firstStartedPulling="2026-04-24 14:27:13.578310495 +0000 UTC m=+179.909588238" lastFinishedPulling="2026-04-24 14:27:18.985621607 +0000 UTC m=+185.316899349" observedRunningTime="2026-04-24 14:27:19.982228861 +0000 UTC m=+186.313506635" watchObservedRunningTime="2026-04-24 14:27:19.983924417 +0000 UTC m=+186.315202180" Apr 24 14:27:21.948233 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:21.948205 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6bf7658d46-wp8rp" Apr 24 14:27:23.419411 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:23.419377 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:23.444159 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:23.444125 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77d59944c9-wxflm"] Apr 24 14:27:47.035647 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:47.035615 2571 
generic.go:358] "Generic (PLEG): container finished" podID="dede47ae-02f2-408e-947c-484180d89394" containerID="e6c890c9982ca1393284c400ed73d9090ddca04426f321458c94aef3ec13e1e7" exitCode=0 Apr 24 14:27:47.036054 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:47.035694 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8blnj" event={"ID":"dede47ae-02f2-408e-947c-484180d89394","Type":"ContainerDied","Data":"e6c890c9982ca1393284c400ed73d9090ddca04426f321458c94aef3ec13e1e7"} Apr 24 14:27:47.036054 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:47.036051 2571 scope.go:117] "RemoveContainer" containerID="e6c890c9982ca1393284c400ed73d9090ddca04426f321458c94aef3ec13e1e7" Apr 24 14:27:48.040294 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.040250 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8blnj" event={"ID":"dede47ae-02f2-408e-947c-484180d89394","Type":"ContainerStarted","Data":"83741bc4d9859812ba475838ef9b0826db6a3418af7fc6d83bb5c35bba604ec5"} Apr 24 14:27:48.463336 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.463231 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-77d59944c9-wxflm" podUID="01608b64-ea1f-41a6-b298-ac6f70f4c55e" containerName="console" containerID="cri-o://6e0bde41bde46c2eea5c27652f302059c5de3bc0717e6b9b0239aba07e028a47" gracePeriod=15 Apr 24 14:27:48.722857 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.722804 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77d59944c9-wxflm_01608b64-ea1f-41a6-b298-ac6f70f4c55e/console/0.log" Apr 24 14:27:48.722960 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.722862 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77d59944c9-wxflm" Apr 24 14:27:48.822831 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.822800 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-oauth-serving-cert\") pod \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " Apr 24 14:27:48.822831 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.822829 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01608b64-ea1f-41a6-b298-ac6f70f4c55e-console-oauth-config\") pod \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " Apr 24 14:27:48.823065 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.822855 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-console-config\") pod \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " Apr 24 14:27:48.823065 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.822873 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-service-ca\") pod \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " Apr 24 14:27:48.823065 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.822914 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-trusted-ca-bundle\") pod \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " Apr 24 14:27:48.823065 ip-10-0-138-116 
kubenswrapper[2571]: I0424 14:27:48.822945 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkzx5\" (UniqueName: \"kubernetes.io/projected/01608b64-ea1f-41a6-b298-ac6f70f4c55e-kube-api-access-xkzx5\") pod \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " Apr 24 14:27:48.823065 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.823000 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01608b64-ea1f-41a6-b298-ac6f70f4c55e-console-serving-cert\") pod \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\" (UID: \"01608b64-ea1f-41a6-b298-ac6f70f4c55e\") " Apr 24 14:27:48.823382 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.823241 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "01608b64-ea1f-41a6-b298-ac6f70f4c55e" (UID: "01608b64-ea1f-41a6-b298-ac6f70f4c55e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:27:48.823501 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.823359 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "01608b64-ea1f-41a6-b298-ac6f70f4c55e" (UID: "01608b64-ea1f-41a6-b298-ac6f70f4c55e"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:27:48.823501 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.823450 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-service-ca" (OuterVolumeSpecName: "service-ca") pod "01608b64-ea1f-41a6-b298-ac6f70f4c55e" (UID: "01608b64-ea1f-41a6-b298-ac6f70f4c55e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:27:48.823583 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.823543 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-console-config" (OuterVolumeSpecName: "console-config") pod "01608b64-ea1f-41a6-b298-ac6f70f4c55e" (UID: "01608b64-ea1f-41a6-b298-ac6f70f4c55e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:27:48.825277 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.825256 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01608b64-ea1f-41a6-b298-ac6f70f4c55e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "01608b64-ea1f-41a6-b298-ac6f70f4c55e" (UID: "01608b64-ea1f-41a6-b298-ac6f70f4c55e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:27:48.825510 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.825482 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01608b64-ea1f-41a6-b298-ac6f70f4c55e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "01608b64-ea1f-41a6-b298-ac6f70f4c55e" (UID: "01608b64-ea1f-41a6-b298-ac6f70f4c55e"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:27:48.825590 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.825513 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01608b64-ea1f-41a6-b298-ac6f70f4c55e-kube-api-access-xkzx5" (OuterVolumeSpecName: "kube-api-access-xkzx5") pod "01608b64-ea1f-41a6-b298-ac6f70f4c55e" (UID: "01608b64-ea1f-41a6-b298-ac6f70f4c55e"). InnerVolumeSpecName "kube-api-access-xkzx5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:27:48.924091 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.924058 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-trusted-ca-bundle\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\"" Apr 24 14:27:48.924091 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.924086 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xkzx5\" (UniqueName: \"kubernetes.io/projected/01608b64-ea1f-41a6-b298-ac6f70f4c55e-kube-api-access-xkzx5\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\"" Apr 24 14:27:48.924283 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.924121 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01608b64-ea1f-41a6-b298-ac6f70f4c55e-console-serving-cert\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\"" Apr 24 14:27:48.924283 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.924134 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-oauth-serving-cert\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\"" Apr 24 14:27:48.924283 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.924148 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/01608b64-ea1f-41a6-b298-ac6f70f4c55e-console-oauth-config\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\"" Apr 24 14:27:48.924283 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.924161 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-console-config\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\"" Apr 24 14:27:48.924283 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:48.924170 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01608b64-ea1f-41a6-b298-ac6f70f4c55e-service-ca\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\"" Apr 24 14:27:49.044632 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:49.044608 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77d59944c9-wxflm_01608b64-ea1f-41a6-b298-ac6f70f4c55e/console/0.log" Apr 24 14:27:49.044986 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:49.044644 2571 generic.go:358] "Generic (PLEG): container finished" podID="01608b64-ea1f-41a6-b298-ac6f70f4c55e" containerID="6e0bde41bde46c2eea5c27652f302059c5de3bc0717e6b9b0239aba07e028a47" exitCode=2 Apr 24 14:27:49.044986 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:49.044708 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77d59944c9-wxflm"
Apr 24 14:27:49.044986 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:49.044738 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77d59944c9-wxflm" event={"ID":"01608b64-ea1f-41a6-b298-ac6f70f4c55e","Type":"ContainerDied","Data":"6e0bde41bde46c2eea5c27652f302059c5de3bc0717e6b9b0239aba07e028a47"}
Apr 24 14:27:49.044986 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:49.044781 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77d59944c9-wxflm" event={"ID":"01608b64-ea1f-41a6-b298-ac6f70f4c55e","Type":"ContainerDied","Data":"de4a68e352a9ce2e846cc8f8cea7073c2337862aef5878e0413e128643e225e5"}
Apr 24 14:27:49.044986 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:49.044798 2571 scope.go:117] "RemoveContainer" containerID="6e0bde41bde46c2eea5c27652f302059c5de3bc0717e6b9b0239aba07e028a47"
Apr 24 14:27:49.052967 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:49.052928 2571 scope.go:117] "RemoveContainer" containerID="6e0bde41bde46c2eea5c27652f302059c5de3bc0717e6b9b0239aba07e028a47"
Apr 24 14:27:49.053259 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:27:49.053234 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e0bde41bde46c2eea5c27652f302059c5de3bc0717e6b9b0239aba07e028a47\": container with ID starting with 6e0bde41bde46c2eea5c27652f302059c5de3bc0717e6b9b0239aba07e028a47 not found: ID does not exist" containerID="6e0bde41bde46c2eea5c27652f302059c5de3bc0717e6b9b0239aba07e028a47"
Apr 24 14:27:49.053355 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:49.053270 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e0bde41bde46c2eea5c27652f302059c5de3bc0717e6b9b0239aba07e028a47"} err="failed to get container status \"6e0bde41bde46c2eea5c27652f302059c5de3bc0717e6b9b0239aba07e028a47\": rpc error: code = NotFound desc = could not find container \"6e0bde41bde46c2eea5c27652f302059c5de3bc0717e6b9b0239aba07e028a47\": container with ID starting with 6e0bde41bde46c2eea5c27652f302059c5de3bc0717e6b9b0239aba07e028a47 not found: ID does not exist"
Apr 24 14:27:49.064510 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:49.064491 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77d59944c9-wxflm"]
Apr 24 14:27:49.068044 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:49.068020 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-77d59944c9-wxflm"]
Apr 24 14:27:50.228583 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:27:50.228491 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01608b64-ea1f-41a6-b298-ac6f70f4c55e" path="/var/lib/kubelet/pods/01608b64-ea1f-41a6-b298-ac6f70f4c55e/volumes"
Apr 24 14:28:13.419934 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:28:13.419897 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:13.434941 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:28:13.434917 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:14.137866 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:28:14.137828 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:28:25.030268 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:28:25.030233 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs\") pod \"network-metrics-daemon-dkhdd\" (UID: \"7ed1658e-98f8-4fe9-bb01-60b235015d4b\") " pod="openshift-multus/network-metrics-daemon-dkhdd"
Apr 24 14:28:25.032476 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:28:25.032458 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ed1658e-98f8-4fe9-bb01-60b235015d4b-metrics-certs\") pod \"network-metrics-daemon-dkhdd\" (UID: \"7ed1658e-98f8-4fe9-bb01-60b235015d4b\") " pod="openshift-multus/network-metrics-daemon-dkhdd"
Apr 24 14:28:25.333030 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:28:25.332951 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nctch\""
Apr 24 14:28:25.341153 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:28:25.341132 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dkhdd"
Apr 24 14:28:25.458854 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:28:25.458829 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dkhdd"]
Apr 24 14:28:25.461360 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:28:25.461328 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ed1658e_98f8_4fe9_bb01_60b235015d4b.slice/crio-250be49173ce422e19c6461c084200667d42562287f99e7242ee57cb4de68040 WatchSource:0}: Error finding container 250be49173ce422e19c6461c084200667d42562287f99e7242ee57cb4de68040: Status 404 returned error can't find the container with id 250be49173ce422e19c6461c084200667d42562287f99e7242ee57cb4de68040
Apr 24 14:28:26.157763 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:28:26.157722 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dkhdd" event={"ID":"7ed1658e-98f8-4fe9-bb01-60b235015d4b","Type":"ContainerStarted","Data":"250be49173ce422e19c6461c084200667d42562287f99e7242ee57cb4de68040"}
Apr 24 14:28:27.161895 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:28:27.161860 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dkhdd" event={"ID":"7ed1658e-98f8-4fe9-bb01-60b235015d4b","Type":"ContainerStarted","Data":"4a7acf70bad079745528a0e4081eec2458541b8c5b45a15b389abf190ae6abf5"}
Apr 24 14:28:27.161895 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:28:27.161899 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dkhdd" event={"ID":"7ed1658e-98f8-4fe9-bb01-60b235015d4b","Type":"ContainerStarted","Data":"a0379b25a317f47d1dca6102f78b5f99c1addb74e7f6f1abf0c1f8bcae88a7ac"}
Apr 24 14:28:27.177499 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:28:27.177447 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-dkhdd" podStartSLOduration=252.236309447 podStartE2EDuration="4m13.177430423s" podCreationTimestamp="2026-04-24 14:24:14 +0000 UTC" firstStartedPulling="2026-04-24 14:28:25.463638744 +0000 UTC m=+251.794916490" lastFinishedPulling="2026-04-24 14:28:26.404759724 +0000 UTC m=+252.736037466" observedRunningTime="2026-04-24 14:28:27.175770044 +0000 UTC m=+253.507047807" watchObservedRunningTime="2026-04-24 14:28:27.177430423 +0000 UTC m=+253.508708186"
Apr 24 14:29:14.111724 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:14.111695 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log"
Apr 24 14:29:14.113377 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:14.113353 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log"
Apr 24 14:29:14.116828 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:14.116793 2571 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 14:29:26.550447 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:26.550413 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-xvzvc"]
Apr 24 14:29:26.554026 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:26.550721 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01608b64-ea1f-41a6-b298-ac6f70f4c55e" containerName="console"
Apr 24 14:29:26.554026 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:26.550734 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="01608b64-ea1f-41a6-b298-ac6f70f4c55e" containerName="console"
Apr 24 14:29:26.554026 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:26.550798 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="01608b64-ea1f-41a6-b298-ac6f70f4c55e" containerName="console"
Apr 24 14:29:26.554787 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:26.554770 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xvzvc"
Apr 24 14:29:26.556842 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:26.556823 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 14:29:26.561389 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:26.561041 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xvzvc"]
Apr 24 14:29:26.708226 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:26.708195 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/67e8ae98-c8a1-486c-98fc-9b7d3a11c174-kubelet-config\") pod \"global-pull-secret-syncer-xvzvc\" (UID: \"67e8ae98-c8a1-486c-98fc-9b7d3a11c174\") " pod="kube-system/global-pull-secret-syncer-xvzvc"
Apr 24 14:29:26.708394 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:26.708245 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/67e8ae98-c8a1-486c-98fc-9b7d3a11c174-dbus\") pod \"global-pull-secret-syncer-xvzvc\" (UID: \"67e8ae98-c8a1-486c-98fc-9b7d3a11c174\") " pod="kube-system/global-pull-secret-syncer-xvzvc"
Apr 24 14:29:26.708394 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:26.708326 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67e8ae98-c8a1-486c-98fc-9b7d3a11c174-original-pull-secret\") pod \"global-pull-secret-syncer-xvzvc\" (UID: \"67e8ae98-c8a1-486c-98fc-9b7d3a11c174\") " pod="kube-system/global-pull-secret-syncer-xvzvc"
Apr 24 14:29:26.808998 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:26.808914 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67e8ae98-c8a1-486c-98fc-9b7d3a11c174-original-pull-secret\") pod \"global-pull-secret-syncer-xvzvc\" (UID: \"67e8ae98-c8a1-486c-98fc-9b7d3a11c174\") " pod="kube-system/global-pull-secret-syncer-xvzvc"
Apr 24 14:29:26.808998 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:26.808973 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/67e8ae98-c8a1-486c-98fc-9b7d3a11c174-kubelet-config\") pod \"global-pull-secret-syncer-xvzvc\" (UID: \"67e8ae98-c8a1-486c-98fc-9b7d3a11c174\") " pod="kube-system/global-pull-secret-syncer-xvzvc"
Apr 24 14:29:26.808998 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:26.808996 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/67e8ae98-c8a1-486c-98fc-9b7d3a11c174-dbus\") pod \"global-pull-secret-syncer-xvzvc\" (UID: \"67e8ae98-c8a1-486c-98fc-9b7d3a11c174\") " pod="kube-system/global-pull-secret-syncer-xvzvc"
Apr 24 14:29:26.809229 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:26.809130 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/67e8ae98-c8a1-486c-98fc-9b7d3a11c174-kubelet-config\") pod \"global-pull-secret-syncer-xvzvc\" (UID: \"67e8ae98-c8a1-486c-98fc-9b7d3a11c174\") " pod="kube-system/global-pull-secret-syncer-xvzvc"
Apr 24 14:29:26.809229 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:26.809168 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/67e8ae98-c8a1-486c-98fc-9b7d3a11c174-dbus\") pod \"global-pull-secret-syncer-xvzvc\" (UID: \"67e8ae98-c8a1-486c-98fc-9b7d3a11c174\") " pod="kube-system/global-pull-secret-syncer-xvzvc"
Apr 24 14:29:26.811206 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:26.811185 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67e8ae98-c8a1-486c-98fc-9b7d3a11c174-original-pull-secret\") pod \"global-pull-secret-syncer-xvzvc\" (UID: \"67e8ae98-c8a1-486c-98fc-9b7d3a11c174\") " pod="kube-system/global-pull-secret-syncer-xvzvc"
Apr 24 14:29:26.864364 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:26.864334 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xvzvc"
Apr 24 14:29:26.979512 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:26.979481 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xvzvc"]
Apr 24 14:29:26.980424 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:29:26.980395 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67e8ae98_c8a1_486c_98fc_9b7d3a11c174.slice/crio-daf6767aa0e5ea063b399ac15b49a328c3f7c439642d51bb5c183119fa230cd1 WatchSource:0}: Error finding container daf6767aa0e5ea063b399ac15b49a328c3f7c439642d51bb5c183119fa230cd1: Status 404 returned error can't find the container with id daf6767aa0e5ea063b399ac15b49a328c3f7c439642d51bb5c183119fa230cd1
Apr 24 14:29:26.982188 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:26.982151 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 14:29:27.322411 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:27.322378 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xvzvc" event={"ID":"67e8ae98-c8a1-486c-98fc-9b7d3a11c174","Type":"ContainerStarted","Data":"daf6767aa0e5ea063b399ac15b49a328c3f7c439642d51bb5c183119fa230cd1"}
Apr 24 14:29:31.337716 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:31.337674 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xvzvc" event={"ID":"67e8ae98-c8a1-486c-98fc-9b7d3a11c174","Type":"ContainerStarted","Data":"d083c04f01b817d8bc3b6962849cbbb59724e72c8fe505e780b77719934081a7"}
Apr 24 14:29:31.352462 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:29:31.352410 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-xvzvc" podStartSLOduration=1.680205882 podStartE2EDuration="5.35239374s" podCreationTimestamp="2026-04-24 14:29:26 +0000 UTC" firstStartedPulling="2026-04-24 14:29:26.982339084 +0000 UTC m=+313.313616843" lastFinishedPulling="2026-04-24 14:29:30.654526955 +0000 UTC m=+316.985804701" observedRunningTime="2026-04-24 14:29:31.350931919 +0000 UTC m=+317.682209693" watchObservedRunningTime="2026-04-24 14:29:31.35239374 +0000 UTC m=+317.683671506"
Apr 24 14:31:43.082616 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:43.082526 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-vznf7"]
Apr 24 14:31:43.085792 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:43.085768 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vznf7"
Apr 24 14:31:43.087697 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:43.087675 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 24 14:31:43.087815 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:43.087720 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 24 14:31:43.088270 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:43.088255 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-txljr\""
Apr 24 14:31:43.098076 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:43.098047 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-vznf7"]
Apr 24 14:31:43.111056 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:43.111026 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfwdr\" (UniqueName: \"kubernetes.io/projected/ccea0364-b805-4174-8095-2979e31415e6-kube-api-access-sfwdr\") pod \"openshift-lws-operator-bfc7f696d-vznf7\" (UID: \"ccea0364-b805-4174-8095-2979e31415e6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vznf7"
Apr 24 14:31:43.111198 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:43.111174 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ccea0364-b805-4174-8095-2979e31415e6-tmp\") pod \"openshift-lws-operator-bfc7f696d-vznf7\" (UID: \"ccea0364-b805-4174-8095-2979e31415e6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vznf7"
Apr 24 14:31:43.211861 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:43.211826 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ccea0364-b805-4174-8095-2979e31415e6-tmp\") pod \"openshift-lws-operator-bfc7f696d-vznf7\" (UID: \"ccea0364-b805-4174-8095-2979e31415e6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vznf7"
Apr 24 14:31:43.212029 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:43.211997 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfwdr\" (UniqueName: \"kubernetes.io/projected/ccea0364-b805-4174-8095-2979e31415e6-kube-api-access-sfwdr\") pod \"openshift-lws-operator-bfc7f696d-vznf7\" (UID: \"ccea0364-b805-4174-8095-2979e31415e6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vznf7"
Apr 24 14:31:43.212206 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:43.212188 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ccea0364-b805-4174-8095-2979e31415e6-tmp\") pod \"openshift-lws-operator-bfc7f696d-vznf7\" (UID: \"ccea0364-b805-4174-8095-2979e31415e6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vznf7"
Apr 24 14:31:43.224833 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:43.224813 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfwdr\" (UniqueName: \"kubernetes.io/projected/ccea0364-b805-4174-8095-2979e31415e6-kube-api-access-sfwdr\") pod \"openshift-lws-operator-bfc7f696d-vznf7\" (UID: \"ccea0364-b805-4174-8095-2979e31415e6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vznf7"
Apr 24 14:31:43.408119 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:43.408019 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vznf7"
Apr 24 14:31:43.522164 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:43.522134 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-vznf7"]
Apr 24 14:31:43.525851 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:31:43.525824 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccea0364_b805_4174_8095_2979e31415e6.slice/crio-265ca9edb1b7120ea185beaad642b677a80792312cfa374b6ed80767bf7b465b WatchSource:0}: Error finding container 265ca9edb1b7120ea185beaad642b677a80792312cfa374b6ed80767bf7b465b: Status 404 returned error can't find the container with id 265ca9edb1b7120ea185beaad642b677a80792312cfa374b6ed80767bf7b465b
Apr 24 14:31:43.706000 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:43.705921 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vznf7" event={"ID":"ccea0364-b805-4174-8095-2979e31415e6","Type":"ContainerStarted","Data":"265ca9edb1b7120ea185beaad642b677a80792312cfa374b6ed80767bf7b465b"}
Apr 24 14:31:45.932239 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:45.932204 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-ql4mw"]
Apr 24 14:31:45.935973 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:45.935944 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-ql4mw"
Apr 24 14:31:45.937894 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:45.937873 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-w8k77\""
Apr 24 14:31:45.937996 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:45.937874 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 24 14:31:45.938409 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:45.938396 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 24 14:31:45.947488 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:45.947465 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-ql4mw"]
Apr 24 14:31:46.040329 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:46.040290 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vldd\" (UniqueName: \"kubernetes.io/projected/7de8c639-2a0d-41b8-8c89-c42e1fe51607-kube-api-access-9vldd\") pod \"cert-manager-79c8d999ff-ql4mw\" (UID: \"7de8c639-2a0d-41b8-8c89-c42e1fe51607\") " pod="cert-manager/cert-manager-79c8d999ff-ql4mw"
Apr 24 14:31:46.040511 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:46.040362 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7de8c639-2a0d-41b8-8c89-c42e1fe51607-bound-sa-token\") pod \"cert-manager-79c8d999ff-ql4mw\" (UID: \"7de8c639-2a0d-41b8-8c89-c42e1fe51607\") " pod="cert-manager/cert-manager-79c8d999ff-ql4mw"
Apr 24 14:31:46.141424 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:46.141388 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7de8c639-2a0d-41b8-8c89-c42e1fe51607-bound-sa-token\") pod \"cert-manager-79c8d999ff-ql4mw\" (UID: \"7de8c639-2a0d-41b8-8c89-c42e1fe51607\") " pod="cert-manager/cert-manager-79c8d999ff-ql4mw"
Apr 24 14:31:46.141576 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:46.141465 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vldd\" (UniqueName: \"kubernetes.io/projected/7de8c639-2a0d-41b8-8c89-c42e1fe51607-kube-api-access-9vldd\") pod \"cert-manager-79c8d999ff-ql4mw\" (UID: \"7de8c639-2a0d-41b8-8c89-c42e1fe51607\") " pod="cert-manager/cert-manager-79c8d999ff-ql4mw"
Apr 24 14:31:46.148667 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:46.148635 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7de8c639-2a0d-41b8-8c89-c42e1fe51607-bound-sa-token\") pod \"cert-manager-79c8d999ff-ql4mw\" (UID: \"7de8c639-2a0d-41b8-8c89-c42e1fe51607\") " pod="cert-manager/cert-manager-79c8d999ff-ql4mw"
Apr 24 14:31:46.148788 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:46.148746 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vldd\" (UniqueName: \"kubernetes.io/projected/7de8c639-2a0d-41b8-8c89-c42e1fe51607-kube-api-access-9vldd\") pod \"cert-manager-79c8d999ff-ql4mw\" (UID: \"7de8c639-2a0d-41b8-8c89-c42e1fe51607\") " pod="cert-manager/cert-manager-79c8d999ff-ql4mw"
Apr 24 14:31:46.246654 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:46.246634 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-ql4mw"
Apr 24 14:31:46.372008 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:46.371986 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-ql4mw"]
Apr 24 14:31:46.374529 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:31:46.374496 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7de8c639_2a0d_41b8_8c89_c42e1fe51607.slice/crio-8574a6207e8ce96a6bc0e04009c733c1c02d25b68e1e46ee2d67982921b00e78 WatchSource:0}: Error finding container 8574a6207e8ce96a6bc0e04009c733c1c02d25b68e1e46ee2d67982921b00e78: Status 404 returned error can't find the container with id 8574a6207e8ce96a6bc0e04009c733c1c02d25b68e1e46ee2d67982921b00e78
Apr 24 14:31:46.723121 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:46.723064 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-ql4mw" event={"ID":"7de8c639-2a0d-41b8-8c89-c42e1fe51607","Type":"ContainerStarted","Data":"8574a6207e8ce96a6bc0e04009c733c1c02d25b68e1e46ee2d67982921b00e78"}
Apr 24 14:31:46.724518 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:46.724491 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vznf7" event={"ID":"ccea0364-b805-4174-8095-2979e31415e6","Type":"ContainerStarted","Data":"0b8fded7eb12cad81ecb2a42c42d92079fc804b53d4d903f15a83b6f4cb6ab49"}
Apr 24 14:31:46.741240 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:46.741186 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vznf7" podStartSLOduration=1.075824472 podStartE2EDuration="3.74117331s" podCreationTimestamp="2026-04-24 14:31:43 +0000 UTC" firstStartedPulling="2026-04-24 14:31:43.527337808 +0000 UTC m=+449.858615562" lastFinishedPulling="2026-04-24 14:31:46.192686642 +0000 UTC m=+452.523964400" observedRunningTime="2026-04-24 14:31:46.739556152 +0000 UTC m=+453.070833918" watchObservedRunningTime="2026-04-24 14:31:46.74117331 +0000 UTC m=+453.072451074"
Apr 24 14:31:49.736174 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:49.736065 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-ql4mw" event={"ID":"7de8c639-2a0d-41b8-8c89-c42e1fe51607","Type":"ContainerStarted","Data":"9937e20c176a7af00549144a7faaa5e0430333f67e7deb88b56f550ac18ed0e3"}
Apr 24 14:31:49.751706 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:49.751657 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-ql4mw" podStartSLOduration=1.7932619079999998 podStartE2EDuration="4.751642467s" podCreationTimestamp="2026-04-24 14:31:45 +0000 UTC" firstStartedPulling="2026-04-24 14:31:46.376449516 +0000 UTC m=+452.707727259" lastFinishedPulling="2026-04-24 14:31:49.334830069 +0000 UTC m=+455.666107818" observedRunningTime="2026-04-24 14:31:49.7502551 +0000 UTC m=+456.081532864" watchObservedRunningTime="2026-04-24 14:31:49.751642467 +0000 UTC m=+456.082920230"
Apr 24 14:31:56.499223 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:56.499191 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb"]
Apr 24 14:31:56.502482 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:56.502458 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb"
Apr 24 14:31:56.505085 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:56.505063 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-dkwp9\""
Apr 24 14:31:56.505233 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:56.505212 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 24 14:31:56.505311 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:56.505213 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 24 14:31:56.505377 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:56.505361 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 24 14:31:56.510572 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:56.510551 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb"]
Apr 24 14:31:56.631252 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:56.631222 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b75e0293-0089-469d-a700-637aba0a6951-cert\") pod \"lws-controller-manager-6d967bb649-24ssb\" (UID: \"b75e0293-0089-469d-a700-637aba0a6951\") " pod="openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb"
Apr 24 14:31:56.631394 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:56.631258 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b75e0293-0089-469d-a700-637aba0a6951-manager-config\") pod \"lws-controller-manager-6d967bb649-24ssb\" (UID: \"b75e0293-0089-469d-a700-637aba0a6951\") " pod="openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb"
Apr 24 14:31:56.631394 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:56.631290 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w4j4\" (UniqueName: \"kubernetes.io/projected/b75e0293-0089-469d-a700-637aba0a6951-kube-api-access-9w4j4\") pod \"lws-controller-manager-6d967bb649-24ssb\" (UID: \"b75e0293-0089-469d-a700-637aba0a6951\") " pod="openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb"
Apr 24 14:31:56.631394 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:56.631374 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b75e0293-0089-469d-a700-637aba0a6951-metrics-cert\") pod \"lws-controller-manager-6d967bb649-24ssb\" (UID: \"b75e0293-0089-469d-a700-637aba0a6951\") " pod="openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb"
Apr 24 14:31:56.732357 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:56.732324 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b75e0293-0089-469d-a700-637aba0a6951-manager-config\") pod \"lws-controller-manager-6d967bb649-24ssb\" (UID: \"b75e0293-0089-469d-a700-637aba0a6951\") " pod="openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb"
Apr 24 14:31:56.732491 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:56.732368 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9w4j4\" (UniqueName: \"kubernetes.io/projected/b75e0293-0089-469d-a700-637aba0a6951-kube-api-access-9w4j4\") pod \"lws-controller-manager-6d967bb649-24ssb\" (UID: \"b75e0293-0089-469d-a700-637aba0a6951\") " pod="openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb"
Apr 24 14:31:56.732491 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:56.732403 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b75e0293-0089-469d-a700-637aba0a6951-metrics-cert\") pod \"lws-controller-manager-6d967bb649-24ssb\" (UID: \"b75e0293-0089-469d-a700-637aba0a6951\") " pod="openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb"
Apr 24 14:31:56.732491 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:56.732439 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b75e0293-0089-469d-a700-637aba0a6951-cert\") pod \"lws-controller-manager-6d967bb649-24ssb\" (UID: \"b75e0293-0089-469d-a700-637aba0a6951\") " pod="openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb"
Apr 24 14:31:56.732919 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:56.732899 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b75e0293-0089-469d-a700-637aba0a6951-manager-config\") pod \"lws-controller-manager-6d967bb649-24ssb\" (UID: \"b75e0293-0089-469d-a700-637aba0a6951\") " pod="openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb"
Apr 24 14:31:56.734869 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:56.734847 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b75e0293-0089-469d-a700-637aba0a6951-metrics-cert\") pod \"lws-controller-manager-6d967bb649-24ssb\" (UID: \"b75e0293-0089-469d-a700-637aba0a6951\") " pod="openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb"
Apr 24 14:31:56.735011 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:56.734989 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b75e0293-0089-469d-a700-637aba0a6951-cert\") pod \"lws-controller-manager-6d967bb649-24ssb\" (UID: \"b75e0293-0089-469d-a700-637aba0a6951\") " pod="openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb"
Apr 24 14:31:56.741082 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:56.741059 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w4j4\" (UniqueName: \"kubernetes.io/projected/b75e0293-0089-469d-a700-637aba0a6951-kube-api-access-9w4j4\") pod \"lws-controller-manager-6d967bb649-24ssb\" (UID: \"b75e0293-0089-469d-a700-637aba0a6951\") " pod="openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb"
Apr 24 14:31:56.811622 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:56.811589 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb"
Apr 24 14:31:56.940201 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:56.939977 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb"]
Apr 24 14:31:56.942410 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:31:56.942380 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb75e0293_0089_469d_a700_637aba0a6951.slice/crio-1fd77396a44759b9feae0cce8961d2578d95fdc1ba46d7fa2d07d300d1a4233c WatchSource:0}: Error finding container 1fd77396a44759b9feae0cce8961d2578d95fdc1ba46d7fa2d07d300d1a4233c: Status 404 returned error can't find the container with id 1fd77396a44759b9feae0cce8961d2578d95fdc1ba46d7fa2d07d300d1a4233c
Apr 24 14:31:57.762834 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:57.762793 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb" event={"ID":"b75e0293-0089-469d-a700-637aba0a6951","Type":"ContainerStarted","Data":"1fd77396a44759b9feae0cce8961d2578d95fdc1ba46d7fa2d07d300d1a4233c"}
Apr 24 14:31:58.767804 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:58.767769 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb" event={"ID":"b75e0293-0089-469d-a700-637aba0a6951","Type":"ContainerStarted","Data":"bd159da32a340d3503a4c9d17ace63ebc1c25bf059b7f95bb565378baaf1db99"}
Apr 24 14:31:58.768180 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:58.767868 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb"
Apr 24 14:31:58.786715 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:31:58.786601 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb" podStartSLOduration=1.203660174 podStartE2EDuration="2.786585028s" podCreationTimestamp="2026-04-24 14:31:56 +0000 UTC" firstStartedPulling="2026-04-24 14:31:56.944338322 +0000 UTC m=+463.275616068" lastFinishedPulling="2026-04-24 14:31:58.527263173 +0000 UTC m=+464.858540922" observedRunningTime="2026-04-24 14:31:58.784816983 +0000 UTC m=+465.116094738" watchObservedRunningTime="2026-04-24 14:31:58.786585028 +0000 UTC m=+465.117862847"
Apr 24 14:32:09.773180 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:32:09.773152 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6d967bb649-24ssb"
Apr 24 14:33:41.779413 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.779380 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c4f96dc7d-vjcjd"]
Apr 24 14:33:41.782544 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.782526 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c4f96dc7d-vjcjd"
Apr 24 14:33:41.784875 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.784851 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 14:33:41.785549 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.785527 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 14:33:41.785672 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.785549 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 14:33:41.785672 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.785548 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 14:33:41.785672 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.785559 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 14:33:41.785672 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.785549 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-z8khd\""
Apr 24 14:33:41.793434 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.793410 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 24 14:33:41.794482 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.794460 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c4f96dc7d-vjcjd"]
Apr 24 14:33:41.895184 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.895152 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName:
\"kubernetes.io/configmap/98592155-aa17-418e-8b08-a8a8055921ce-oauth-serving-cert\") pod \"console-6c4f96dc7d-vjcjd\" (UID: \"98592155-aa17-418e-8b08-a8a8055921ce\") " pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:41.895345 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.895195 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/98592155-aa17-418e-8b08-a8a8055921ce-console-config\") pod \"console-6c4f96dc7d-vjcjd\" (UID: \"98592155-aa17-418e-8b08-a8a8055921ce\") " pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:41.895345 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.895270 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/98592155-aa17-418e-8b08-a8a8055921ce-console-oauth-config\") pod \"console-6c4f96dc7d-vjcjd\" (UID: \"98592155-aa17-418e-8b08-a8a8055921ce\") " pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:41.895345 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.895318 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/98592155-aa17-418e-8b08-a8a8055921ce-service-ca\") pod \"console-6c4f96dc7d-vjcjd\" (UID: \"98592155-aa17-418e-8b08-a8a8055921ce\") " pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:41.895447 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.895363 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/98592155-aa17-418e-8b08-a8a8055921ce-console-serving-cert\") pod \"console-6c4f96dc7d-vjcjd\" (UID: \"98592155-aa17-418e-8b08-a8a8055921ce\") " pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:41.895447 ip-10-0-138-116 kubenswrapper[2571]: 
I0424 14:33:41.895389 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98592155-aa17-418e-8b08-a8a8055921ce-trusted-ca-bundle\") pod \"console-6c4f96dc7d-vjcjd\" (UID: \"98592155-aa17-418e-8b08-a8a8055921ce\") " pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:41.895447 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.895418 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64ppt\" (UniqueName: \"kubernetes.io/projected/98592155-aa17-418e-8b08-a8a8055921ce-kube-api-access-64ppt\") pod \"console-6c4f96dc7d-vjcjd\" (UID: \"98592155-aa17-418e-8b08-a8a8055921ce\") " pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:41.996434 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.996403 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/98592155-aa17-418e-8b08-a8a8055921ce-console-serving-cert\") pod \"console-6c4f96dc7d-vjcjd\" (UID: \"98592155-aa17-418e-8b08-a8a8055921ce\") " pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:41.996600 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.996441 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98592155-aa17-418e-8b08-a8a8055921ce-trusted-ca-bundle\") pod \"console-6c4f96dc7d-vjcjd\" (UID: \"98592155-aa17-418e-8b08-a8a8055921ce\") " pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:41.996600 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.996464 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64ppt\" (UniqueName: \"kubernetes.io/projected/98592155-aa17-418e-8b08-a8a8055921ce-kube-api-access-64ppt\") pod \"console-6c4f96dc7d-vjcjd\" (UID: 
\"98592155-aa17-418e-8b08-a8a8055921ce\") " pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:41.996669 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.996604 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/98592155-aa17-418e-8b08-a8a8055921ce-oauth-serving-cert\") pod \"console-6c4f96dc7d-vjcjd\" (UID: \"98592155-aa17-418e-8b08-a8a8055921ce\") " pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:41.996669 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.996650 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/98592155-aa17-418e-8b08-a8a8055921ce-console-config\") pod \"console-6c4f96dc7d-vjcjd\" (UID: \"98592155-aa17-418e-8b08-a8a8055921ce\") " pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:41.996764 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.996703 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/98592155-aa17-418e-8b08-a8a8055921ce-console-oauth-config\") pod \"console-6c4f96dc7d-vjcjd\" (UID: \"98592155-aa17-418e-8b08-a8a8055921ce\") " pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:41.996764 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.996747 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/98592155-aa17-418e-8b08-a8a8055921ce-service-ca\") pod \"console-6c4f96dc7d-vjcjd\" (UID: \"98592155-aa17-418e-8b08-a8a8055921ce\") " pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:41.997403 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.997376 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/98592155-aa17-418e-8b08-a8a8055921ce-console-config\") pod \"console-6c4f96dc7d-vjcjd\" (UID: \"98592155-aa17-418e-8b08-a8a8055921ce\") " pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:41.997534 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.997376 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/98592155-aa17-418e-8b08-a8a8055921ce-oauth-serving-cert\") pod \"console-6c4f96dc7d-vjcjd\" (UID: \"98592155-aa17-418e-8b08-a8a8055921ce\") " pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:41.997534 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.997446 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/98592155-aa17-418e-8b08-a8a8055921ce-service-ca\") pod \"console-6c4f96dc7d-vjcjd\" (UID: \"98592155-aa17-418e-8b08-a8a8055921ce\") " pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:41.997534 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.997446 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98592155-aa17-418e-8b08-a8a8055921ce-trusted-ca-bundle\") pod \"console-6c4f96dc7d-vjcjd\" (UID: \"98592155-aa17-418e-8b08-a8a8055921ce\") " pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:41.998846 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.998817 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/98592155-aa17-418e-8b08-a8a8055921ce-console-serving-cert\") pod \"console-6c4f96dc7d-vjcjd\" (UID: \"98592155-aa17-418e-8b08-a8a8055921ce\") " pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:41.999061 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:41.999040 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/98592155-aa17-418e-8b08-a8a8055921ce-console-oauth-config\") pod \"console-6c4f96dc7d-vjcjd\" (UID: \"98592155-aa17-418e-8b08-a8a8055921ce\") " pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:42.007269 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:42.007247 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64ppt\" (UniqueName: \"kubernetes.io/projected/98592155-aa17-418e-8b08-a8a8055921ce-kube-api-access-64ppt\") pod \"console-6c4f96dc7d-vjcjd\" (UID: \"98592155-aa17-418e-8b08-a8a8055921ce\") " pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:42.091974 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:42.091881 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:42.218612 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:42.218577 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c4f96dc7d-vjcjd"] Apr 24 14:33:42.221543 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:33:42.221506 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98592155_aa17_418e_8b08_a8a8055921ce.slice/crio-affd242f2ebf88b6b14fa5062d512e8b86eb119a6468bfc92bfae1c9d0339dad WatchSource:0}: Error finding container affd242f2ebf88b6b14fa5062d512e8b86eb119a6468bfc92bfae1c9d0339dad: Status 404 returned error can't find the container with id affd242f2ebf88b6b14fa5062d512e8b86eb119a6468bfc92bfae1c9d0339dad Apr 24 14:33:43.070033 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:43.069996 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c4f96dc7d-vjcjd" event={"ID":"98592155-aa17-418e-8b08-a8a8055921ce","Type":"ContainerStarted","Data":"9fbfa9cd9f365b827e213d7989da5442c9d5bf049f95058eb115fe8ecbde4d57"} Apr 24 14:33:43.070453 ip-10-0-138-116 
kubenswrapper[2571]: I0424 14:33:43.070040 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c4f96dc7d-vjcjd" event={"ID":"98592155-aa17-418e-8b08-a8a8055921ce","Type":"ContainerStarted","Data":"affd242f2ebf88b6b14fa5062d512e8b86eb119a6468bfc92bfae1c9d0339dad"} Apr 24 14:33:43.088730 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:43.088682 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c4f96dc7d-vjcjd" podStartSLOduration=2.088668384 podStartE2EDuration="2.088668384s" podCreationTimestamp="2026-04-24 14:33:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:33:43.087990202 +0000 UTC m=+569.419267967" watchObservedRunningTime="2026-04-24 14:33:43.088668384 +0000 UTC m=+569.419946147" Apr 24 14:33:52.092156 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:52.092094 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:52.092551 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:52.092171 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:52.097137 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:52.097108 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:52.100759 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:52.100729 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c4f96dc7d-vjcjd" Apr 24 14:33:54.756655 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:54.756622 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-ghmw2"] Apr 24 14:33:54.760994 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:54.760976 
2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghmw2" Apr 24 14:33:54.763225 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:54.763201 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 24 14:33:54.763350 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:54.763226 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 24 14:33:54.763350 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:54.763227 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-7plhs\"" Apr 24 14:33:54.764071 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:54.764053 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 24 14:33:54.764071 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:54.764070 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 24 14:33:54.767409 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:54.767386 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-ghmw2"] Apr 24 14:33:54.905914 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:54.905878 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqlgs\" (UniqueName: \"kubernetes.io/projected/34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c-kube-api-access-jqlgs\") pod \"kuadrant-console-plugin-6c886788f8-ghmw2\" (UID: \"34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghmw2" Apr 24 14:33:54.906125 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:54.906017 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-ghmw2\" (UID: \"34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghmw2" Apr 24 14:33:54.906125 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:54.906084 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-ghmw2\" (UID: \"34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghmw2" Apr 24 14:33:55.007014 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:55.006919 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-ghmw2\" (UID: \"34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghmw2" Apr 24 14:33:55.007014 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:55.006980 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-ghmw2\" (UID: \"34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghmw2" Apr 24 14:33:55.007240 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:55.007051 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqlgs\" (UniqueName: \"kubernetes.io/projected/34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c-kube-api-access-jqlgs\") pod 
\"kuadrant-console-plugin-6c886788f8-ghmw2\" (UID: \"34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghmw2" Apr 24 14:33:55.008432 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:55.008397 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-ghmw2\" (UID: \"34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghmw2" Apr 24 14:33:55.010935 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:55.010905 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-ghmw2\" (UID: \"34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghmw2" Apr 24 14:33:55.016340 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:55.016307 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqlgs\" (UniqueName: \"kubernetes.io/projected/34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c-kube-api-access-jqlgs\") pod \"kuadrant-console-plugin-6c886788f8-ghmw2\" (UID: \"34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghmw2" Apr 24 14:33:55.070496 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:55.070469 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghmw2" Apr 24 14:33:55.190300 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:55.190164 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-ghmw2"] Apr 24 14:33:55.192585 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:33:55.192554 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34a6b42d_8a3a_4a39_ae2a_5db86c28eb7c.slice/crio-d2a8b74c9b23fe8ee2491d94e6e7bef0e610c1e01c4ae27c8b1493c458128085 WatchSource:0}: Error finding container d2a8b74c9b23fe8ee2491d94e6e7bef0e610c1e01c4ae27c8b1493c458128085: Status 404 returned error can't find the container with id d2a8b74c9b23fe8ee2491d94e6e7bef0e610c1e01c4ae27c8b1493c458128085 Apr 24 14:33:56.111041 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:33:56.111000 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghmw2" event={"ID":"34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c","Type":"ContainerStarted","Data":"d2a8b74c9b23fe8ee2491d94e6e7bef0e610c1e01c4ae27c8b1493c458128085"} Apr 24 14:34:00.126120 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:00.126063 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghmw2" event={"ID":"34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c","Type":"ContainerStarted","Data":"58dd55e2068f9e51a37d107e2c9d5e3e39b85b94ef838e0593acfa9db8a78ec3"} Apr 24 14:34:00.149435 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:00.144991 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghmw2" podStartSLOduration=1.687159689 podStartE2EDuration="6.144972792s" podCreationTimestamp="2026-04-24 14:33:54 +0000 UTC" firstStartedPulling="2026-04-24 14:33:55.193835921 +0000 UTC m=+581.525113667" 
lastFinishedPulling="2026-04-24 14:33:59.651649023 +0000 UTC m=+585.982926770" observedRunningTime="2026-04-24 14:34:00.144204392 +0000 UTC m=+586.475482166" watchObservedRunningTime="2026-04-24 14:34:00.144972792 +0000 UTC m=+586.476250557" Apr 24 14:34:14.139588 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:14.139557 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log" Apr 24 14:34:14.140090 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:14.139956 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log" Apr 24 14:34:37.429234 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:37.429163 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-tkn5h"] Apr 24 14:34:37.432277 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:37.432262 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-tkn5h" Apr 24 14:34:37.434332 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:37.434315 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 24 14:34:37.439911 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:37.439856 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-tkn5h"] Apr 24 14:34:37.519313 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:37.519283 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-tkn5h"] Apr 24 14:34:37.565673 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:37.565648 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnr2l\" (UniqueName: \"kubernetes.io/projected/c302200d-9985-4e06-8990-c4721107709b-kube-api-access-jnr2l\") pod \"limitador-limitador-64c8f475fb-tkn5h\" (UID: \"c302200d-9985-4e06-8990-c4721107709b\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-tkn5h" Apr 24 14:34:37.565818 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:37.565695 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c302200d-9985-4e06-8990-c4721107709b-config-file\") pod \"limitador-limitador-64c8f475fb-tkn5h\" (UID: \"c302200d-9985-4e06-8990-c4721107709b\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-tkn5h" Apr 24 14:34:37.666731 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:37.666699 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c302200d-9985-4e06-8990-c4721107709b-config-file\") pod \"limitador-limitador-64c8f475fb-tkn5h\" (UID: \"c302200d-9985-4e06-8990-c4721107709b\") " 
pod="kuadrant-system/limitador-limitador-64c8f475fb-tkn5h" Apr 24 14:34:37.666882 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:37.666793 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jnr2l\" (UniqueName: \"kubernetes.io/projected/c302200d-9985-4e06-8990-c4721107709b-kube-api-access-jnr2l\") pod \"limitador-limitador-64c8f475fb-tkn5h\" (UID: \"c302200d-9985-4e06-8990-c4721107709b\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-tkn5h" Apr 24 14:34:37.667328 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:37.667307 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c302200d-9985-4e06-8990-c4721107709b-config-file\") pod \"limitador-limitador-64c8f475fb-tkn5h\" (UID: \"c302200d-9985-4e06-8990-c4721107709b\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-tkn5h" Apr 24 14:34:37.674867 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:37.674844 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnr2l\" (UniqueName: \"kubernetes.io/projected/c302200d-9985-4e06-8990-c4721107709b-kube-api-access-jnr2l\") pod \"limitador-limitador-64c8f475fb-tkn5h\" (UID: \"c302200d-9985-4e06-8990-c4721107709b\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-tkn5h" Apr 24 14:34:37.743002 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:37.742948 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-tkn5h" Apr 24 14:34:37.862172 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:37.862142 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-tkn5h"] Apr 24 14:34:37.865623 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:34:37.865594 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc302200d_9985_4e06_8990_c4721107709b.slice/crio-099ae3aaf49483f1a08d68a80ee5a802fcb9dd37837ec09171d83cb1bea8527f WatchSource:0}: Error finding container 099ae3aaf49483f1a08d68a80ee5a802fcb9dd37837ec09171d83cb1bea8527f: Status 404 returned error can't find the container with id 099ae3aaf49483f1a08d68a80ee5a802fcb9dd37837ec09171d83cb1bea8527f Apr 24 14:34:37.867644 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:37.867626 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:34:38.238503 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:38.238464 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-tkn5h" event={"ID":"c302200d-9985-4e06-8990-c4721107709b","Type":"ContainerStarted","Data":"099ae3aaf49483f1a08d68a80ee5a802fcb9dd37837ec09171d83cb1bea8527f"} Apr 24 14:34:40.245792 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:40.245751 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-tkn5h" event={"ID":"c302200d-9985-4e06-8990-c4721107709b","Type":"ContainerStarted","Data":"97ddd57c6fc5a1e9bc2f8949a879e2006623f95bfde2ce85ea0c77ecdb481965"} Apr 24 14:34:40.246267 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:40.245803 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-tkn5h" Apr 24 14:34:40.260412 ip-10-0-138-116 kubenswrapper[2571]: 
I0424 14:34:40.260368 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-tkn5h" podStartSLOduration=1.8644303880000002 podStartE2EDuration="3.260356661s" podCreationTimestamp="2026-04-24 14:34:37 +0000 UTC" firstStartedPulling="2026-04-24 14:34:37.867803682 +0000 UTC m=+624.199081438" lastFinishedPulling="2026-04-24 14:34:39.263729965 +0000 UTC m=+625.595007711" observedRunningTime="2026-04-24 14:34:40.259670512 +0000 UTC m=+626.590948279" watchObservedRunningTime="2026-04-24 14:34:40.260356661 +0000 UTC m=+626.591634425" Apr 24 14:34:51.250383 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:51.250350 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-tkn5h" Apr 24 14:34:53.915179 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:53.915143 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-tkn5h"] Apr 24 14:34:53.915540 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:53.915338 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-tkn5h" podUID="c302200d-9985-4e06-8990-c4721107709b" containerName="limitador" containerID="cri-o://97ddd57c6fc5a1e9bc2f8949a879e2006623f95bfde2ce85ea0c77ecdb481965" gracePeriod=30 Apr 24 14:34:54.447939 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:54.447918 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-tkn5h" Apr 24 14:34:54.615050 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:54.615018 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnr2l\" (UniqueName: \"kubernetes.io/projected/c302200d-9985-4e06-8990-c4721107709b-kube-api-access-jnr2l\") pod \"c302200d-9985-4e06-8990-c4721107709b\" (UID: \"c302200d-9985-4e06-8990-c4721107709b\") " Apr 24 14:34:54.615229 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:54.615090 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c302200d-9985-4e06-8990-c4721107709b-config-file\") pod \"c302200d-9985-4e06-8990-c4721107709b\" (UID: \"c302200d-9985-4e06-8990-c4721107709b\") " Apr 24 14:34:54.615472 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:54.615443 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c302200d-9985-4e06-8990-c4721107709b-config-file" (OuterVolumeSpecName: "config-file") pod "c302200d-9985-4e06-8990-c4721107709b" (UID: "c302200d-9985-4e06-8990-c4721107709b"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:34:54.617264 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:54.617240 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c302200d-9985-4e06-8990-c4721107709b-kube-api-access-jnr2l" (OuterVolumeSpecName: "kube-api-access-jnr2l") pod "c302200d-9985-4e06-8990-c4721107709b" (UID: "c302200d-9985-4e06-8990-c4721107709b"). InnerVolumeSpecName "kube-api-access-jnr2l". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:34:54.715683 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:54.715658 2571 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c302200d-9985-4e06-8990-c4721107709b-config-file\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\"" Apr 24 14:34:54.715683 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:54.715684 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jnr2l\" (UniqueName: \"kubernetes.io/projected/c302200d-9985-4e06-8990-c4721107709b-kube-api-access-jnr2l\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\"" Apr 24 14:34:55.295749 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:55.295715 2571 generic.go:358] "Generic (PLEG): container finished" podID="c302200d-9985-4e06-8990-c4721107709b" containerID="97ddd57c6fc5a1e9bc2f8949a879e2006623f95bfde2ce85ea0c77ecdb481965" exitCode=0 Apr 24 14:34:55.296225 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:55.295779 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-tkn5h" Apr 24 14:34:55.296225 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:55.295795 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-tkn5h" event={"ID":"c302200d-9985-4e06-8990-c4721107709b","Type":"ContainerDied","Data":"97ddd57c6fc5a1e9bc2f8949a879e2006623f95bfde2ce85ea0c77ecdb481965"} Apr 24 14:34:55.296225 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:55.295829 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-tkn5h" event={"ID":"c302200d-9985-4e06-8990-c4721107709b","Type":"ContainerDied","Data":"099ae3aaf49483f1a08d68a80ee5a802fcb9dd37837ec09171d83cb1bea8527f"} Apr 24 14:34:55.296225 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:55.295844 2571 scope.go:117] "RemoveContainer" containerID="97ddd57c6fc5a1e9bc2f8949a879e2006623f95bfde2ce85ea0c77ecdb481965" Apr 24 14:34:55.303914 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:55.303897 2571 scope.go:117] "RemoveContainer" containerID="97ddd57c6fc5a1e9bc2f8949a879e2006623f95bfde2ce85ea0c77ecdb481965" Apr 24 14:34:55.304158 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:34:55.304139 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97ddd57c6fc5a1e9bc2f8949a879e2006623f95bfde2ce85ea0c77ecdb481965\": container with ID starting with 97ddd57c6fc5a1e9bc2f8949a879e2006623f95bfde2ce85ea0c77ecdb481965 not found: ID does not exist" containerID="97ddd57c6fc5a1e9bc2f8949a879e2006623f95bfde2ce85ea0c77ecdb481965" Apr 24 14:34:55.304220 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:55.304166 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97ddd57c6fc5a1e9bc2f8949a879e2006623f95bfde2ce85ea0c77ecdb481965"} err="failed to get container status 
\"97ddd57c6fc5a1e9bc2f8949a879e2006623f95bfde2ce85ea0c77ecdb481965\": rpc error: code = NotFound desc = could not find container \"97ddd57c6fc5a1e9bc2f8949a879e2006623f95bfde2ce85ea0c77ecdb481965\": container with ID starting with 97ddd57c6fc5a1e9bc2f8949a879e2006623f95bfde2ce85ea0c77ecdb481965 not found: ID does not exist" Apr 24 14:34:55.315262 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:55.315238 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-tkn5h"] Apr 24 14:34:55.318561 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:55.318542 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-tkn5h"] Apr 24 14:34:56.228746 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:34:56.228713 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c302200d-9985-4e06-8990-c4721107709b" path="/var/lib/kubelet/pods/c302200d-9985-4e06-8990-c4721107709b/volumes" Apr 24 14:35:12.663500 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.663462 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr"] Apr 24 14:35:12.663883 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.663779 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c302200d-9985-4e06-8990-c4721107709b" containerName="limitador" Apr 24 14:35:12.663883 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.663789 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c302200d-9985-4e06-8990-c4721107709b" containerName="limitador" Apr 24 14:35:12.663883 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.663842 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="c302200d-9985-4e06-8990-c4721107709b" containerName="limitador" Apr 24 14:35:12.666781 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.666764 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.668867 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.668841 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 24 14:35:12.668997 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.668899 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 24 14:35:12.668997 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.668922 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 24 14:35:12.668997 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.668841 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 24 14:35:12.669185 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.669170 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-xxlf7\"" Apr 24 14:35:12.675953 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.675934 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr"] Apr 24 14:35:12.749598 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.749566 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/35817697-9201-4809-ab68-ca05ef564674-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-6d7fr\" (UID: \"35817697-9201-4809-ab68-ca05ef564674\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.749753 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.749604 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf4m8\" (UniqueName: 
\"kubernetes.io/projected/35817697-9201-4809-ab68-ca05ef564674-kube-api-access-zf4m8\") pod \"istiod-openshift-gateway-55ff986f96-6d7fr\" (UID: \"35817697-9201-4809-ab68-ca05ef564674\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.749753 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.749675 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/35817697-9201-4809-ab68-ca05ef564674-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-6d7fr\" (UID: \"35817697-9201-4809-ab68-ca05ef564674\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.749753 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.749709 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/35817697-9201-4809-ab68-ca05ef564674-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-6d7fr\" (UID: \"35817697-9201-4809-ab68-ca05ef564674\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.749753 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.749729 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/35817697-9201-4809-ab68-ca05ef564674-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-6d7fr\" (UID: \"35817697-9201-4809-ab68-ca05ef564674\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.749883 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.749762 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/35817697-9201-4809-ab68-ca05ef564674-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-6d7fr\" (UID: 
\"35817697-9201-4809-ab68-ca05ef564674\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.749883 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.749800 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/35817697-9201-4809-ab68-ca05ef564674-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-6d7fr\" (UID: \"35817697-9201-4809-ab68-ca05ef564674\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.850772 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.850737 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/35817697-9201-4809-ab68-ca05ef564674-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-6d7fr\" (UID: \"35817697-9201-4809-ab68-ca05ef564674\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.850772 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.850781 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zf4m8\" (UniqueName: \"kubernetes.io/projected/35817697-9201-4809-ab68-ca05ef564674-kube-api-access-zf4m8\") pod \"istiod-openshift-gateway-55ff986f96-6d7fr\" (UID: \"35817697-9201-4809-ab68-ca05ef564674\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.850983 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.850828 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/35817697-9201-4809-ab68-ca05ef564674-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-6d7fr\" (UID: \"35817697-9201-4809-ab68-ca05ef564674\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.850983 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.850847 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/35817697-9201-4809-ab68-ca05ef564674-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-6d7fr\" (UID: \"35817697-9201-4809-ab68-ca05ef564674\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.850983 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.850864 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/35817697-9201-4809-ab68-ca05ef564674-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-6d7fr\" (UID: \"35817697-9201-4809-ab68-ca05ef564674\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.850983 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.850905 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/35817697-9201-4809-ab68-ca05ef564674-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-6d7fr\" (UID: \"35817697-9201-4809-ab68-ca05ef564674\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.851202 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.851039 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/35817697-9201-4809-ab68-ca05ef564674-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-6d7fr\" (UID: \"35817697-9201-4809-ab68-ca05ef564674\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.851684 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.851658 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/35817697-9201-4809-ab68-ca05ef564674-istio-csr-ca-configmap\") pod 
\"istiod-openshift-gateway-55ff986f96-6d7fr\" (UID: \"35817697-9201-4809-ab68-ca05ef564674\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.853212 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.853191 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/35817697-9201-4809-ab68-ca05ef564674-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-6d7fr\" (UID: \"35817697-9201-4809-ab68-ca05ef564674\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.853328 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.853238 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/35817697-9201-4809-ab68-ca05ef564674-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-6d7fr\" (UID: \"35817697-9201-4809-ab68-ca05ef564674\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.853328 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.853245 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/35817697-9201-4809-ab68-ca05ef564674-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-6d7fr\" (UID: \"35817697-9201-4809-ab68-ca05ef564674\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.853435 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.853342 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/35817697-9201-4809-ab68-ca05ef564674-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-6d7fr\" (UID: \"35817697-9201-4809-ab68-ca05ef564674\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.863999 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.863978 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/35817697-9201-4809-ab68-ca05ef564674-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-6d7fr\" (UID: \"35817697-9201-4809-ab68-ca05ef564674\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.864911 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.864880 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf4m8\" (UniqueName: \"kubernetes.io/projected/35817697-9201-4809-ab68-ca05ef564674-kube-api-access-zf4m8\") pod \"istiod-openshift-gateway-55ff986f96-6d7fr\" (UID: \"35817697-9201-4809-ab68-ca05ef564674\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:12.976302 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:12.976227 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:13.120749 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:13.120721 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr"] Apr 24 14:35:13.123601 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:35:13.123573 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35817697_9201_4809_ab68_ca05ef564674.slice/crio-ed495454c3781c16c64104a7a50a1035ef0742882db228081b8e813346dccf3b WatchSource:0}: Error finding container ed495454c3781c16c64104a7a50a1035ef0742882db228081b8e813346dccf3b: Status 404 returned error can't find the container with id ed495454c3781c16c64104a7a50a1035ef0742882db228081b8e813346dccf3b Apr 24 14:35:13.349045 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:13.349009 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" 
event={"ID":"35817697-9201-4809-ab68-ca05ef564674","Type":"ContainerStarted","Data":"ed495454c3781c16c64104a7a50a1035ef0742882db228081b8e813346dccf3b"} Apr 24 14:35:15.546727 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:15.546677 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 24 14:35:15.547010 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:15.546745 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 24 14:35:16.358925 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:16.358888 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" event={"ID":"35817697-9201-4809-ab68-ca05ef564674","Type":"ContainerStarted","Data":"a1fa2eefe84d36b8cfa84676c6fc8f78f6708e65b0a49ffe9e20825ef2ec315a"} Apr 24 14:35:16.359089 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:16.359024 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:35:16.360789 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:16.360765 2571 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-6d7fr container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 24 14:35:16.360890 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:16.360832 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" podUID="35817697-9201-4809-ab68-ca05ef564674" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:35:16.395961 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:16.395906 
2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" podStartSLOduration=1.975271852 podStartE2EDuration="4.395888045s" podCreationTimestamp="2026-04-24 14:35:12 +0000 UTC" firstStartedPulling="2026-04-24 14:35:13.125821023 +0000 UTC m=+659.457098786" lastFinishedPulling="2026-04-24 14:35:15.546437233 +0000 UTC m=+661.877714979" observedRunningTime="2026-04-24 14:35:16.393682743 +0000 UTC m=+662.724960531" watchObservedRunningTime="2026-04-24 14:35:16.395888045 +0000 UTC m=+662.727165809" Apr 24 14:35:17.362958 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:35:17.362927 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6d7fr" Apr 24 14:39:14.160680 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:39:14.160644 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log" Apr 24 14:39:14.162159 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:39:14.162140 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log" Apr 24 14:40:49.908907 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:49.908876 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz"] Apr 24 14:40:49.912464 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:49.912442 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:40:49.915187 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:49.915161 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 14:40:49.915296 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:49.915261 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-92j8s\"" Apr 24 14:40:49.915637 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:49.915616 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 14:40:49.915730 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:49.915633 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 24 14:40:49.922867 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:49.922847 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz"] Apr 24 14:40:49.983416 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:49.983388 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf6s2\" (UniqueName: \"kubernetes.io/projected/3c0e742c-f090-438a-96f1-39d685b3dec6-kube-api-access-kf6s2\") pod \"precise-prefix-cache-test-kserve-75469f89c9-xmbsz\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:40:49.983549 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:49.983422 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0e742c-f090-438a-96f1-39d685b3dec6-tls-certs\") pod 
\"precise-prefix-cache-test-kserve-75469f89c9-xmbsz\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:40:49.983549 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:49.983443 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-home\") pod \"precise-prefix-cache-test-kserve-75469f89c9-xmbsz\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:40:49.983549 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:49.983508 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-model-cache\") pod \"precise-prefix-cache-test-kserve-75469f89c9-xmbsz\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:40:49.983549 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:49.983546 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-dshm\") pod \"precise-prefix-cache-test-kserve-75469f89c9-xmbsz\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:40:49.983677 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:49.983571 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-75469f89c9-xmbsz\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:40:50.085058 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:50.084980 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-dshm\") pod \"precise-prefix-cache-test-kserve-75469f89c9-xmbsz\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:40:50.085058 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:50.085025 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-75469f89c9-xmbsz\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:40:50.085254 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:50.085117 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kf6s2\" (UniqueName: \"kubernetes.io/projected/3c0e742c-f090-438a-96f1-39d685b3dec6-kube-api-access-kf6s2\") pod \"precise-prefix-cache-test-kserve-75469f89c9-xmbsz\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:40:50.085254 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:50.085150 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0e742c-f090-438a-96f1-39d685b3dec6-tls-certs\") pod \"precise-prefix-cache-test-kserve-75469f89c9-xmbsz\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:40:50.085254 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:50.085179 
2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-home\") pod \"precise-prefix-cache-test-kserve-75469f89c9-xmbsz\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:40:50.085254 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:50.085215 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-model-cache\") pod \"precise-prefix-cache-test-kserve-75469f89c9-xmbsz\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:40:50.085536 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:50.085509 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-75469f89c9-xmbsz\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:40:50.085597 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:50.085541 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-model-cache\") pod \"precise-prefix-cache-test-kserve-75469f89c9-xmbsz\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:40:50.085597 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:50.085576 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-home\") pod 
\"precise-prefix-cache-test-kserve-75469f89c9-xmbsz\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:40:50.087445 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:50.087425 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-dshm\") pod \"precise-prefix-cache-test-kserve-75469f89c9-xmbsz\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:40:50.087619 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:50.087604 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0e742c-f090-438a-96f1-39d685b3dec6-tls-certs\") pod \"precise-prefix-cache-test-kserve-75469f89c9-xmbsz\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:40:50.093587 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:50.093563 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf6s2\" (UniqueName: \"kubernetes.io/projected/3c0e742c-f090-438a-96f1-39d685b3dec6-kube-api-access-kf6s2\") pod \"precise-prefix-cache-test-kserve-75469f89c9-xmbsz\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:40:50.223584 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:50.223552 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:40:50.347279 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:50.347256 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz"] Apr 24 14:40:50.349866 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:40:50.349833 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c0e742c_f090_438a_96f1_39d685b3dec6.slice/crio-d9a4032f76edf8050ad9cd9b1ee3ee9bd9962e11755ed42978ea0714d75fb7ee WatchSource:0}: Error finding container d9a4032f76edf8050ad9cd9b1ee3ee9bd9962e11755ed42978ea0714d75fb7ee: Status 404 returned error can't find the container with id d9a4032f76edf8050ad9cd9b1ee3ee9bd9962e11755ed42978ea0714d75fb7ee Apr 24 14:40:50.352072 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:50.352050 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:40:50.379696 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:50.379664 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" event={"ID":"3c0e742c-f090-438a-96f1-39d685b3dec6","Type":"ContainerStarted","Data":"d9a4032f76edf8050ad9cd9b1ee3ee9bd9962e11755ed42978ea0714d75fb7ee"} Apr 24 14:40:54.398631 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:54.398590 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" event={"ID":"3c0e742c-f090-438a-96f1-39d685b3dec6","Type":"ContainerStarted","Data":"539cb5bf8c0b51233eed9d2f6c41815cc7dd2b7ca12e1568e3208c52e6d192eb"} Apr 24 14:40:58.417380 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:58.417341 2571 generic.go:358] "Generic (PLEG): container finished" podID="3c0e742c-f090-438a-96f1-39d685b3dec6" 
containerID="539cb5bf8c0b51233eed9d2f6c41815cc7dd2b7ca12e1568e3208c52e6d192eb" exitCode=0 Apr 24 14:40:58.417816 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:40:58.417417 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" event={"ID":"3c0e742c-f090-438a-96f1-39d685b3dec6","Type":"ContainerDied","Data":"539cb5bf8c0b51233eed9d2f6c41815cc7dd2b7ca12e1568e3208c52e6d192eb"} Apr 24 14:41:00.425768 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:00.425730 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" event={"ID":"3c0e742c-f090-438a-96f1-39d685b3dec6","Type":"ContainerStarted","Data":"b381c7f75082e349c4cad35cc34f800ee927ade27d7dd0c7befe6de265faa818"} Apr 24 14:41:00.443218 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:00.443170 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" podStartSLOduration=2.205454127 podStartE2EDuration="11.443156613s" podCreationTimestamp="2026-04-24 14:40:49 +0000 UTC" firstStartedPulling="2026-04-24 14:40:50.352258809 +0000 UTC m=+996.683536556" lastFinishedPulling="2026-04-24 14:40:59.5899613 +0000 UTC m=+1005.921239042" observedRunningTime="2026-04-24 14:41:00.441860043 +0000 UTC m=+1006.773137819" watchObservedRunningTime="2026-04-24 14:41:00.443156613 +0000 UTC m=+1006.774434378" Apr 24 14:41:10.224386 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:10.224353 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:41:10.224386 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:10.224395 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:41:10.237067 ip-10-0-138-116 
kubenswrapper[2571]: I0424 14:41:10.237039 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:41:10.470471 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:10.470438 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:41:31.799345 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:31.799309 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz"] Apr 24 14:41:31.799891 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:31.799592 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" podUID="3c0e742c-f090-438a-96f1-39d685b3dec6" containerName="main" containerID="cri-o://b381c7f75082e349c4cad35cc34f800ee927ade27d7dd0c7befe6de265faa818" gracePeriod=30 Apr 24 14:41:32.037271 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.037249 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:41:32.054323 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.054245 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-kserve-provision-location\") pod \"3c0e742c-f090-438a-96f1-39d685b3dec6\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " Apr 24 14:41:32.054323 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.054297 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf6s2\" (UniqueName: \"kubernetes.io/projected/3c0e742c-f090-438a-96f1-39d685b3dec6-kube-api-access-kf6s2\") pod \"3c0e742c-f090-438a-96f1-39d685b3dec6\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " Apr 24 14:41:32.054638 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.054329 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-home\") pod \"3c0e742c-f090-438a-96f1-39d685b3dec6\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " Apr 24 14:41:32.054638 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.054377 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0e742c-f090-438a-96f1-39d685b3dec6-tls-certs\") pod \"3c0e742c-f090-438a-96f1-39d685b3dec6\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " Apr 24 14:41:32.054638 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.054400 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-model-cache\") pod \"3c0e742c-f090-438a-96f1-39d685b3dec6\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " Apr 24 14:41:32.054638 
ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.054464 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-dshm\") pod \"3c0e742c-f090-438a-96f1-39d685b3dec6\" (UID: \"3c0e742c-f090-438a-96f1-39d685b3dec6\") " Apr 24 14:41:32.055773 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.054640 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-home" (OuterVolumeSpecName: "home") pod "3c0e742c-f090-438a-96f1-39d685b3dec6" (UID: "3c0e742c-f090-438a-96f1-39d685b3dec6"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:41:32.055773 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.054753 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-home\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\"" Apr 24 14:41:32.055773 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.055310 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-model-cache" (OuterVolumeSpecName: "model-cache") pod "3c0e742c-f090-438a-96f1-39d685b3dec6" (UID: "3c0e742c-f090-438a-96f1-39d685b3dec6"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:41:32.057535 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.057494 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c0e742c-f090-438a-96f1-39d685b3dec6-kube-api-access-kf6s2" (OuterVolumeSpecName: "kube-api-access-kf6s2") pod "3c0e742c-f090-438a-96f1-39d685b3dec6" (UID: "3c0e742c-f090-438a-96f1-39d685b3dec6"). InnerVolumeSpecName "kube-api-access-kf6s2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:41:32.057632 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.057540 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0e742c-f090-438a-96f1-39d685b3dec6-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3c0e742c-f090-438a-96f1-39d685b3dec6" (UID: "3c0e742c-f090-438a-96f1-39d685b3dec6"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:41:32.057632 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.057624 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-dshm" (OuterVolumeSpecName: "dshm") pod "3c0e742c-f090-438a-96f1-39d685b3dec6" (UID: "3c0e742c-f090-438a-96f1-39d685b3dec6"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:41:32.120590 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.120552 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3c0e742c-f090-438a-96f1-39d685b3dec6" (UID: "3c0e742c-f090-438a-96f1-39d685b3dec6"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:41:32.155810 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.155783 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-dshm\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\"" Apr 24 14:41:32.155810 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.155812 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-kserve-provision-location\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\"" Apr 24 14:41:32.155980 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.155822 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kf6s2\" (UniqueName: \"kubernetes.io/projected/3c0e742c-f090-438a-96f1-39d685b3dec6-kube-api-access-kf6s2\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\"" Apr 24 14:41:32.155980 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.155831 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0e742c-f090-438a-96f1-39d685b3dec6-tls-certs\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\"" Apr 24 14:41:32.155980 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.155841 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c0e742c-f090-438a-96f1-39d685b3dec6-model-cache\") on node \"ip-10-0-138-116.ec2.internal\" DevicePath \"\"" Apr 24 14:41:32.532696 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.532656 2571 generic.go:358] "Generic (PLEG): container finished" podID="3c0e742c-f090-438a-96f1-39d685b3dec6" containerID="b381c7f75082e349c4cad35cc34f800ee927ade27d7dd0c7befe6de265faa818" exitCode=0 Apr 24 14:41:32.532876 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.532737 2571 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" event={"ID":"3c0e742c-f090-438a-96f1-39d685b3dec6","Type":"ContainerDied","Data":"b381c7f75082e349c4cad35cc34f800ee927ade27d7dd0c7befe6de265faa818"} Apr 24 14:41:32.532876 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.532757 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" Apr 24 14:41:32.532876 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.532779 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz" event={"ID":"3c0e742c-f090-438a-96f1-39d685b3dec6","Type":"ContainerDied","Data":"d9a4032f76edf8050ad9cd9b1ee3ee9bd9962e11755ed42978ea0714d75fb7ee"} Apr 24 14:41:32.532876 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.532795 2571 scope.go:117] "RemoveContainer" containerID="b381c7f75082e349c4cad35cc34f800ee927ade27d7dd0c7befe6de265faa818" Apr 24 14:41:32.540823 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.540803 2571 scope.go:117] "RemoveContainer" containerID="539cb5bf8c0b51233eed9d2f6c41815cc7dd2b7ca12e1568e3208c52e6d192eb" Apr 24 14:41:32.546952 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.546931 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz"] Apr 24 14:41:32.552116 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.552082 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-75469f89c9-xmbsz"] Apr 24 14:41:32.598893 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.598870 2571 scope.go:117] "RemoveContainer" containerID="b381c7f75082e349c4cad35cc34f800ee927ade27d7dd0c7befe6de265faa818" Apr 24 14:41:32.599225 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:41:32.599201 2571 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b381c7f75082e349c4cad35cc34f800ee927ade27d7dd0c7befe6de265faa818\": container with ID starting with b381c7f75082e349c4cad35cc34f800ee927ade27d7dd0c7befe6de265faa818 not found: ID does not exist" containerID="b381c7f75082e349c4cad35cc34f800ee927ade27d7dd0c7befe6de265faa818" Apr 24 14:41:32.599309 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.599233 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b381c7f75082e349c4cad35cc34f800ee927ade27d7dd0c7befe6de265faa818"} err="failed to get container status \"b381c7f75082e349c4cad35cc34f800ee927ade27d7dd0c7befe6de265faa818\": rpc error: code = NotFound desc = could not find container \"b381c7f75082e349c4cad35cc34f800ee927ade27d7dd0c7befe6de265faa818\": container with ID starting with b381c7f75082e349c4cad35cc34f800ee927ade27d7dd0c7befe6de265faa818 not found: ID does not exist" Apr 24 14:41:32.599309 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.599253 2571 scope.go:117] "RemoveContainer" containerID="539cb5bf8c0b51233eed9d2f6c41815cc7dd2b7ca12e1568e3208c52e6d192eb" Apr 24 14:41:32.599544 ip-10-0-138-116 kubenswrapper[2571]: E0424 14:41:32.599525 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"539cb5bf8c0b51233eed9d2f6c41815cc7dd2b7ca12e1568e3208c52e6d192eb\": container with ID starting with 539cb5bf8c0b51233eed9d2f6c41815cc7dd2b7ca12e1568e3208c52e6d192eb not found: ID does not exist" containerID="539cb5bf8c0b51233eed9d2f6c41815cc7dd2b7ca12e1568e3208c52e6d192eb" Apr 24 14:41:32.599599 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:32.599556 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"539cb5bf8c0b51233eed9d2f6c41815cc7dd2b7ca12e1568e3208c52e6d192eb"} err="failed to get container status 
\"539cb5bf8c0b51233eed9d2f6c41815cc7dd2b7ca12e1568e3208c52e6d192eb\": rpc error: code = NotFound desc = could not find container \"539cb5bf8c0b51233eed9d2f6c41815cc7dd2b7ca12e1568e3208c52e6d192eb\": container with ID starting with 539cb5bf8c0b51233eed9d2f6c41815cc7dd2b7ca12e1568e3208c52e6d192eb not found: ID does not exist" Apr 24 14:41:34.228768 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:41:34.228732 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c0e742c-f090-438a-96f1-39d685b3dec6" path="/var/lib/kubelet/pods/3c0e742c-f090-438a-96f1-39d685b3dec6/volumes" Apr 24 14:44:14.181644 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:44:14.181619 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log" Apr 24 14:44:14.184009 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:44:14.183990 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log" Apr 24 14:47:01.573521 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:01.573484 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-5668f86b7c-hzgf6"] Apr 24 14:47:01.573932 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:01.573845 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c0e742c-f090-438a-96f1-39d685b3dec6" containerName="storage-initializer" Apr 24 14:47:01.573932 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:01.573857 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0e742c-f090-438a-96f1-39d685b3dec6" containerName="storage-initializer" Apr 24 14:47:01.573932 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:01.573874 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c0e742c-f090-438a-96f1-39d685b3dec6" containerName="main" Apr 24 14:47:01.573932 
ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:01.573880 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0e742c-f090-438a-96f1-39d685b3dec6" containerName="main" Apr 24 14:47:01.574050 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:01.573947 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c0e742c-f090-438a-96f1-39d685b3dec6" containerName="main" Apr 24 14:47:01.576891 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:01.576873 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5668f86b7c-hzgf6" Apr 24 14:47:01.579470 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:01.579442 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 24 14:47:01.579566 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:01.579465 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 14:47:01.579566 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:01.579469 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 14:47:01.579662 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:01.579455 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-hxjpp\"" Apr 24 14:47:01.584978 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:01.584957 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5668f86b7c-hzgf6"] Apr 24 14:47:01.617389 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:01.617361 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g6h8\" (UniqueName: \"kubernetes.io/projected/0d48fa20-44db-4656-9284-3d8b53aa7608-kube-api-access-6g6h8\") pod \"llmisvc-controller-manager-5668f86b7c-hzgf6\" (UID: 
\"0d48fa20-44db-4656-9284-3d8b53aa7608\") " pod="kserve/llmisvc-controller-manager-5668f86b7c-hzgf6" Apr 24 14:47:01.617508 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:01.617406 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d48fa20-44db-4656-9284-3d8b53aa7608-cert\") pod \"llmisvc-controller-manager-5668f86b7c-hzgf6\" (UID: \"0d48fa20-44db-4656-9284-3d8b53aa7608\") " pod="kserve/llmisvc-controller-manager-5668f86b7c-hzgf6" Apr 24 14:47:01.718173 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:01.718142 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d48fa20-44db-4656-9284-3d8b53aa7608-cert\") pod \"llmisvc-controller-manager-5668f86b7c-hzgf6\" (UID: \"0d48fa20-44db-4656-9284-3d8b53aa7608\") " pod="kserve/llmisvc-controller-manager-5668f86b7c-hzgf6" Apr 24 14:47:01.718336 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:01.718239 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6g6h8\" (UniqueName: \"kubernetes.io/projected/0d48fa20-44db-4656-9284-3d8b53aa7608-kube-api-access-6g6h8\") pod \"llmisvc-controller-manager-5668f86b7c-hzgf6\" (UID: \"0d48fa20-44db-4656-9284-3d8b53aa7608\") " pod="kserve/llmisvc-controller-manager-5668f86b7c-hzgf6" Apr 24 14:47:01.720505 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:01.720485 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d48fa20-44db-4656-9284-3d8b53aa7608-cert\") pod \"llmisvc-controller-manager-5668f86b7c-hzgf6\" (UID: \"0d48fa20-44db-4656-9284-3d8b53aa7608\") " pod="kserve/llmisvc-controller-manager-5668f86b7c-hzgf6" Apr 24 14:47:01.725612 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:01.725584 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g6h8\" (UniqueName: 
\"kubernetes.io/projected/0d48fa20-44db-4656-9284-3d8b53aa7608-kube-api-access-6g6h8\") pod \"llmisvc-controller-manager-5668f86b7c-hzgf6\" (UID: \"0d48fa20-44db-4656-9284-3d8b53aa7608\") " pod="kserve/llmisvc-controller-manager-5668f86b7c-hzgf6" Apr 24 14:47:01.887786 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:01.887696 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5668f86b7c-hzgf6" Apr 24 14:47:02.007183 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:02.007152 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5668f86b7c-hzgf6"] Apr 24 14:47:02.009765 ip-10-0-138-116 kubenswrapper[2571]: W0424 14:47:02.009739 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0d48fa20_44db_4656_9284_3d8b53aa7608.slice/crio-bad06558a2e342d4a04c51576484f85e4846ee836605b5e02958f04e21e1d6eb WatchSource:0}: Error finding container bad06558a2e342d4a04c51576484f85e4846ee836605b5e02958f04e21e1d6eb: Status 404 returned error can't find the container with id bad06558a2e342d4a04c51576484f85e4846ee836605b5e02958f04e21e1d6eb Apr 24 14:47:02.011066 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:02.011049 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:47:02.527827 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:02.527796 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5668f86b7c-hzgf6" event={"ID":"0d48fa20-44db-4656-9284-3d8b53aa7608","Type":"ContainerStarted","Data":"bad06558a2e342d4a04c51576484f85e4846ee836605b5e02958f04e21e1d6eb"} Apr 24 14:47:06.542626 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:06.542591 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5668f86b7c-hzgf6" 
event={"ID":"0d48fa20-44db-4656-9284-3d8b53aa7608","Type":"ContainerStarted","Data":"42fc7913e3dd95037704d8ab8780221f0b85ad1a80b15ab9900b86e36c840ae2"} Apr 24 14:47:06.543042 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:06.542704 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-5668f86b7c-hzgf6" Apr 24 14:47:06.564955 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:06.564903 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-5668f86b7c-hzgf6" podStartSLOduration=2.106377674 podStartE2EDuration="5.564887647s" podCreationTimestamp="2026-04-24 14:47:01 +0000 UTC" firstStartedPulling="2026-04-24 14:47:02.011190702 +0000 UTC m=+1368.342468444" lastFinishedPulling="2026-04-24 14:47:05.46970066 +0000 UTC m=+1371.800978417" observedRunningTime="2026-04-24 14:47:06.556244717 +0000 UTC m=+1372.887522481" watchObservedRunningTime="2026-04-24 14:47:06.564887647 +0000 UTC m=+1372.896165411" Apr 24 14:47:37.548553 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:47:37.548522 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-5668f86b7c-hzgf6" Apr 24 14:49:14.203469 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:49:14.203436 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log" Apr 24 14:49:14.209367 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:49:14.209349 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log" Apr 24 14:54:14.230373 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:54:14.230335 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log" Apr 24 
14:54:14.235029 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:54:14.235005 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log" Apr 24 14:59:14.252049 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:59:14.252024 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log" Apr 24 14:59:14.257316 ip-10-0-138-116 kubenswrapper[2571]: I0424 14:59:14.257293 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log" Apr 24 15:00:07.805904 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:07.805789 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"] Apr 24 15:00:07.809759 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:07.809724 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9" Apr 24 15:00:07.812873 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:07.812847 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 15:00:07.812999 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:07.812894 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 15:00:07.812999 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:07.812920 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-pg7zz\"" Apr 24 15:00:07.812999 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:07.812970 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 24 15:00:07.821808 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:07.821787 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"] Apr 24 15:00:07.928964 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:07.928928 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0f64d937-69b2-483f-9393-8287964d7163-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9" Apr 24 15:00:07.929136 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:07.928971 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0f64d937-69b2-483f-9393-8287964d7163-credential-socket\") pod 
\"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9" Apr 24 15:00:07.929136 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:07.929022 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj24h\" (UniqueName: \"kubernetes.io/projected/0f64d937-69b2-483f-9393-8287964d7163-kube-api-access-wj24h\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9" Apr 24 15:00:07.929136 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:07.929055 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0f64d937-69b2-483f-9393-8287964d7163-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9" Apr 24 15:00:07.929136 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:07.929080 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0f64d937-69b2-483f-9393-8287964d7163-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9" Apr 24 15:00:07.929136 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:07.929094 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0f64d937-69b2-483f-9393-8287964d7163-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: 
\"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:07.929301 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:07.929152 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0f64d937-69b2-483f-9393-8287964d7163-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:07.929301 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:07.929176 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0f64d937-69b2-483f-9393-8287964d7163-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:07.929301 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:07.929197 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0f64d937-69b2-483f-9393-8287964d7163-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:08.029855 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:08.029813 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0f64d937-69b2-483f-9393-8287964d7163-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:08.029855 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:08.029861 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0f64d937-69b2-483f-9393-8287964d7163-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:08.030084 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:08.029876 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0f64d937-69b2-483f-9393-8287964d7163-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:08.030084 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:08.029906 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0f64d937-69b2-483f-9393-8287964d7163-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:08.030084 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:08.029934 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0f64d937-69b2-483f-9393-8287964d7163-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:08.030084 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:08.029953 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0f64d937-69b2-483f-9393-8287964d7163-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:08.030084 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:08.029991 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0f64d937-69b2-483f-9393-8287964d7163-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:08.030084 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:08.030047 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0f64d937-69b2-483f-9393-8287964d7163-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:08.030084 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:08.030084 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wj24h\" (UniqueName: \"kubernetes.io/projected/0f64d937-69b2-483f-9393-8287964d7163-kube-api-access-wj24h\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:08.030484 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:08.030365 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0f64d937-69b2-483f-9393-8287964d7163-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:08.030484 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:08.030388 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0f64d937-69b2-483f-9393-8287964d7163-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:08.030484 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:08.030423 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0f64d937-69b2-483f-9393-8287964d7163-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:08.030618 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:08.030596 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0f64d937-69b2-483f-9393-8287964d7163-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:08.031013 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:08.030990 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0f64d937-69b2-483f-9393-8287964d7163-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:08.032332 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:08.032315 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0f64d937-69b2-483f-9393-8287964d7163-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:08.032518 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:08.032495 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0f64d937-69b2-483f-9393-8287964d7163-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:08.037527 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:08.037504 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0f64d937-69b2-483f-9393-8287964d7163-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:08.037634 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:08.037529 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj24h\" (UniqueName: \"kubernetes.io/projected/0f64d937-69b2-483f-9393-8287964d7163-kube-api-access-wj24h\") pod \"router-gateway-2-openshift-default-6866b85949-nt4x9\" (UID: \"0f64d937-69b2-483f-9393-8287964d7163\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:08.124919 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:08.124827 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:08.264219 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:08.264191 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"]
Apr 24 15:00:08.268135 ip-10-0-138-116 kubenswrapper[2571]: W0424 15:00:08.268092 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f64d937_69b2_483f_9393_8287964d7163.slice/crio-49f09563986ec274e7871308ca2df90dca38c71455a7ac5fe5922c5660c3a4f7 WatchSource:0}: Error finding container 49f09563986ec274e7871308ca2df90dca38c71455a7ac5fe5922c5660c3a4f7: Status 404 returned error can't find the container with id 49f09563986ec274e7871308ca2df90dca38c71455a7ac5fe5922c5660c3a4f7
Apr 24 15:00:08.270367 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:08.270349 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 15:00:09.197981 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:09.197948 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9" event={"ID":"0f64d937-69b2-483f-9393-8287964d7163","Type":"ContainerStarted","Data":"49f09563986ec274e7871308ca2df90dca38c71455a7ac5fe5922c5660c3a4f7"}
Apr 24 15:00:14.133169 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:14.133131 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 24 15:00:14.133446 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:14.133213 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 24 15:00:14.133446 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:14.133269 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 24 15:00:14.216513 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:14.216475 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9" event={"ID":"0f64d937-69b2-483f-9393-8287964d7163","Type":"ContainerStarted","Data":"55459d4340755e85285924ecc0e52fa8335284d1c1336b5a5323937dac790c94"}
Apr 24 15:00:14.236115 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:14.236036 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9" podStartSLOduration=1.373669157 podStartE2EDuration="7.236018211s" podCreationTimestamp="2026-04-24 15:00:07 +0000 UTC" firstStartedPulling="2026-04-24 15:00:08.270507062 +0000 UTC m=+2154.601784809" lastFinishedPulling="2026-04-24 15:00:14.132856116 +0000 UTC m=+2160.464133863" observedRunningTime="2026-04-24 15:00:14.23333799 +0000 UTC m=+2160.564615755" watchObservedRunningTime="2026-04-24 15:00:14.236018211 +0000 UTC m=+2160.567295976"
Apr 24 15:00:15.125082 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:15.125036 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:15.126606 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:15.126565 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9" podUID="0f64d937-69b2-483f-9393-8287964d7163" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.29:15021/healthz/ready\": dial tcp 10.134.0.29:15021: connect: connection refused"
Apr 24 15:00:16.125582 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:16.125531 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9" podUID="0f64d937-69b2-483f-9393-8287964d7163" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.29:15021/healthz/ready\": dial tcp 10.134.0.29:15021: connect: connection refused"
Apr 24 15:00:17.126075 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:17.126034 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9" podUID="0f64d937-69b2-483f-9393-8287964d7163" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.29:15021/healthz/ready\": dial tcp 10.134.0.29:15021: connect: connection refused"
Apr 24 15:00:18.129163 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:18.129131 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:18.129540 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:18.129403 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:00:18.130165 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:00:18.130146 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nt4x9"
Apr 24 15:04:14.276804 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:04:14.276764 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log"
Apr 24 15:04:14.281688 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:04:14.281666 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log"
Apr 24 15:04:58.299297 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:04:58.299221 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nt4x9_0f64d937-69b2-483f-9393-8287964d7163/istio-proxy/0.log"
Apr 24 15:04:59.293761 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:04:59.293735 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nt4x9_0f64d937-69b2-483f-9393-8287964d7163/istio-proxy/0.log"
Apr 24 15:05:00.250697 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:00.250666 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nt4x9_0f64d937-69b2-483f-9393-8287964d7163/istio-proxy/0.log"
Apr 24 15:05:01.183383 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:01.183350 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nt4x9_0f64d937-69b2-483f-9393-8287964d7163/istio-proxy/0.log"
Apr 24 15:05:02.123432 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:02.123402 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nt4x9_0f64d937-69b2-483f-9393-8287964d7163/istio-proxy/0.log"
Apr 24 15:05:03.065240 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:03.065206 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nt4x9_0f64d937-69b2-483f-9393-8287964d7163/istio-proxy/0.log"
Apr 24 15:05:04.014092 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:04.014056 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nt4x9_0f64d937-69b2-483f-9393-8287964d7163/istio-proxy/0.log"
Apr 24 15:05:04.942723 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:04.942690 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nt4x9_0f64d937-69b2-483f-9393-8287964d7163/istio-proxy/0.log"
Apr 24 15:05:05.858980 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:05.858952 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nt4x9_0f64d937-69b2-483f-9393-8287964d7163/istio-proxy/0.log"
Apr 24 15:05:06.811265 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:06.811234 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nt4x9_0f64d937-69b2-483f-9393-8287964d7163/istio-proxy/0.log"
Apr 24 15:05:07.743717 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:07.743689 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nt4x9_0f64d937-69b2-483f-9393-8287964d7163/istio-proxy/0.log"
Apr 24 15:05:08.682332 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:08.682287 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nt4x9_0f64d937-69b2-483f-9393-8287964d7163/istio-proxy/0.log"
Apr 24 15:05:09.633617 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:09.633589 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nt4x9_0f64d937-69b2-483f-9393-8287964d7163/istio-proxy/0.log"
Apr 24 15:05:10.642901 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:10.642868 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nt4x9_0f64d937-69b2-483f-9393-8287964d7163/istio-proxy/0.log"
Apr 24 15:05:11.583627 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:11.583578 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-6d7fr_35817697-9201-4809-ab68-ca05ef564674/discovery/0.log"
Apr 24 15:05:11.611420 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:11.611390 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5d8799dfd8-qtxtc_de813c8e-7fcb-4f67-b2eb-58050b724a12/router/0.log"
Apr 24 15:05:12.346211 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:12.346172 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-6d7fr_35817697-9201-4809-ab68-ca05ef564674/discovery/0.log"
Apr 24 15:05:12.378801 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:12.378770 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5d8799dfd8-qtxtc_de813c8e-7fcb-4f67-b2eb-58050b724a12/router/0.log"
Apr 24 15:05:13.121830 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:13.121804 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-ghmw2_34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c/kuadrant-console-plugin/0.log"
Apr 24 15:05:18.432577 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:18.432542 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-xvzvc_67e8ae98-c8a1-486c-98fc-9b7d3a11c174/global-pull-secret-syncer/0.log"
Apr 24 15:05:18.495847 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:18.495802 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-f25sj_1e92bec3-9630-4928-b6e4-dcf3fbc8dd82/konnectivity-agent/0.log"
Apr 24 15:05:18.582423 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:18.582387 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-116.ec2.internal_63ded8f515ead39ec1575ef940918d4c/haproxy/0.log"
Apr 24 15:05:22.102029 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:22.102001 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-ghmw2_34a6b42d-8a3a-4a39-ae2a-5db86c28eb7c/kuadrant-console-plugin/0.log"
Apr 24 15:05:23.765109 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:23.765078 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zvj56_cd857084-4b07-4387-b3f1-e5478ef38bdc/node-exporter/0.log"
Apr 24 15:05:23.785971 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:23.785942 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zvj56_cd857084-4b07-4387-b3f1-e5478ef38bdc/kube-rbac-proxy/0.log"
Apr 24 15:05:23.808086 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:23.808057 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zvj56_cd857084-4b07-4387-b3f1-e5478ef38bdc/init-textfile/0.log"
Apr 24 15:05:23.918496 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:23.918415 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2654f68e-fd75-44a5-9479-64d02268d0e2/prometheus/0.log"
Apr 24 15:05:23.936569 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:23.936543 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2654f68e-fd75-44a5-9479-64d02268d0e2/config-reloader/0.log"
Apr 24 15:05:23.961717 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:23.961693 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2654f68e-fd75-44a5-9479-64d02268d0e2/thanos-sidecar/0.log"
Apr 24 15:05:23.983231 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:23.983204 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2654f68e-fd75-44a5-9479-64d02268d0e2/kube-rbac-proxy-web/0.log"
Apr 24 15:05:24.006912 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:24.006884 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2654f68e-fd75-44a5-9479-64d02268d0e2/kube-rbac-proxy/0.log"
Apr 24 15:05:24.030562 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:24.030537 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2654f68e-fd75-44a5-9479-64d02268d0e2/kube-rbac-proxy-thanos/0.log"
Apr 24 15:05:24.052080 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:24.052059 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2654f68e-fd75-44a5-9479-64d02268d0e2/init-config-reloader/0.log"
Apr 24 15:05:24.080711 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:24.080680 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-fgjh4_22aa473c-cf15-4f73-b224-47edf2e4211a/prometheus-operator/0.log"
Apr 24 15:05:24.104626 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:24.104600 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-fgjh4_22aa473c-cf15-4f73-b224-47edf2e4211a/kube-rbac-proxy/0.log"
Apr 24 15:05:24.136348 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:24.136324 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-mtp5z_f598951f-f7af-4599-aef7-cf715800fb86/prometheus-operator-admission-webhook/0.log"
Apr 24 15:05:24.233792 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:24.233720 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6bf7658d46-wp8rp_dde381ac-d51e-4fed-80be-b149847b7866/thanos-query/0.log"
Apr 24 15:05:24.263029 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:24.262992 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6bf7658d46-wp8rp_dde381ac-d51e-4fed-80be-b149847b7866/kube-rbac-proxy-web/0.log"
Apr 24 15:05:24.279284 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:24.279251 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6bf7658d46-wp8rp_dde381ac-d51e-4fed-80be-b149847b7866/kube-rbac-proxy/0.log"
Apr 24 15:05:24.301022 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:24.300997 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6bf7658d46-wp8rp_dde381ac-d51e-4fed-80be-b149847b7866/prom-label-proxy/0.log"
Apr 24 15:05:24.324218 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:24.324193 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6bf7658d46-wp8rp_dde381ac-d51e-4fed-80be-b149847b7866/kube-rbac-proxy-rules/0.log"
Apr 24 15:05:24.351366 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:24.351340 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6bf7658d46-wp8rp_dde381ac-d51e-4fed-80be-b149847b7866/kube-rbac-proxy-metrics/0.log"
Apr 24 15:05:26.739565 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:26.739533 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c4f96dc7d-vjcjd_98592155-aa17-418e-8b08-a8a8055921ce/console/0.log"
Apr 24 15:05:26.771962 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:26.771922 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-tpm5c_75f88656-05dd-4670-8fee-421667219118/download-server/0.log"
Apr 24 15:05:27.741042 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:27.741004 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84"]
Apr 24 15:05:27.744976 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:27.744951 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84"
Apr 24 15:05:27.747896 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:27.747877 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rpb9c\"/\"kube-root-ca.crt\""
Apr 24 15:05:27.748009 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:27.747975 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rpb9c\"/\"openshift-service-ca.crt\""
Apr 24 15:05:27.748385 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:27.748371 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-rpb9c\"/\"default-dockercfg-rtxl9\""
Apr 24 15:05:27.752067 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:27.751984 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84"]
Apr 24 15:05:27.802058 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:27.802024 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/37f44c9a-a159-4884-b485-aab3044770b1-lib-modules\") pod \"perf-node-gather-daemonset-wtv84\" (UID: \"37f44c9a-a159-4884-b485-aab3044770b1\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84"
Apr 24 15:05:27.802247 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:27.802094 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/37f44c9a-a159-4884-b485-aab3044770b1-proc\") pod \"perf-node-gather-daemonset-wtv84\" (UID: \"37f44c9a-a159-4884-b485-aab3044770b1\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84"
Apr 24 15:05:27.802247 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:27.802175 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/37f44c9a-a159-4884-b485-aab3044770b1-podres\") pod \"perf-node-gather-daemonset-wtv84\" (UID: \"37f44c9a-a159-4884-b485-aab3044770b1\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84"
Apr 24 15:05:27.802247 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:27.802215 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v679g\" (UniqueName: \"kubernetes.io/projected/37f44c9a-a159-4884-b485-aab3044770b1-kube-api-access-v679g\") pod \"perf-node-gather-daemonset-wtv84\" (UID: \"37f44c9a-a159-4884-b485-aab3044770b1\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84"
Apr 24 15:05:27.802351 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:27.802255 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37f44c9a-a159-4884-b485-aab3044770b1-sys\") pod \"perf-node-gather-daemonset-wtv84\" (UID: \"37f44c9a-a159-4884-b485-aab3044770b1\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84"
Apr 24 15:05:27.903350 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:27.903320 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/37f44c9a-a159-4884-b485-aab3044770b1-proc\") pod \"perf-node-gather-daemonset-wtv84\" (UID: \"37f44c9a-a159-4884-b485-aab3044770b1\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84"
Apr 24 15:05:27.903350 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:27.903360 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/37f44c9a-a159-4884-b485-aab3044770b1-podres\") pod \"perf-node-gather-daemonset-wtv84\" (UID: \"37f44c9a-a159-4884-b485-aab3044770b1\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84"
Apr 24 15:05:27.903553 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:27.903389 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v679g\" (UniqueName: \"kubernetes.io/projected/37f44c9a-a159-4884-b485-aab3044770b1-kube-api-access-v679g\") pod \"perf-node-gather-daemonset-wtv84\" (UID: \"37f44c9a-a159-4884-b485-aab3044770b1\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84"
Apr 24 15:05:27.903553 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:27.903419 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37f44c9a-a159-4884-b485-aab3044770b1-sys\") pod \"perf-node-gather-daemonset-wtv84\" (UID: \"37f44c9a-a159-4884-b485-aab3044770b1\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84"
Apr 24 15:05:27.903553 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:27.903462 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/37f44c9a-a159-4884-b485-aab3044770b1-lib-modules\") pod \"perf-node-gather-daemonset-wtv84\" (UID: \"37f44c9a-a159-4884-b485-aab3044770b1\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84"
Apr 24 15:05:27.903553 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:27.903460 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/37f44c9a-a159-4884-b485-aab3044770b1-proc\") pod \"perf-node-gather-daemonset-wtv84\" (UID: \"37f44c9a-a159-4884-b485-aab3044770b1\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84"
Apr 24 15:05:27.903553 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:27.903503 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/37f44c9a-a159-4884-b485-aab3044770b1-podres\") pod \"perf-node-gather-daemonset-wtv84\" (UID: \"37f44c9a-a159-4884-b485-aab3044770b1\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84"
Apr 24 15:05:27.903553 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:27.903530 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37f44c9a-a159-4884-b485-aab3044770b1-sys\") pod \"perf-node-gather-daemonset-wtv84\" (UID: \"37f44c9a-a159-4884-b485-aab3044770b1\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84"
Apr 24 15:05:27.903731 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:27.903588 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/37f44c9a-a159-4884-b485-aab3044770b1-lib-modules\") pod \"perf-node-gather-daemonset-wtv84\" (UID: \"37f44c9a-a159-4884-b485-aab3044770b1\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84"
Apr 24 15:05:27.910634 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:27.910610 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v679g\" (UniqueName: \"kubernetes.io/projected/37f44c9a-a159-4884-b485-aab3044770b1-kube-api-access-v679g\") pod \"perf-node-gather-daemonset-wtv84\" (UID: \"37f44c9a-a159-4884-b485-aab3044770b1\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84"
Apr 24 15:05:28.055536 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:28.055495 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84"
Apr 24 15:05:28.059890 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:28.059867 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-z29nv_92dbdc96-9b06-45f2-9e4e-317abc345922/dns/0.log"
Apr 24 15:05:28.079611 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:28.079577 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-z29nv_92dbdc96-9b06-45f2-9e4e-317abc345922/kube-rbac-proxy/0.log"
Apr 24 15:05:28.103747 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:28.103717 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2rs7b_2ded180c-8601-4aaf-86bd-6a13b101faa8/dns-node-resolver/0.log"
Apr 24 15:05:28.180003 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:28.179923 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84"]
Apr 24 15:05:28.183027 ip-10-0-138-116 kubenswrapper[2571]: W0424 15:05:28.182992 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod37f44c9a_a159_4884_b485_aab3044770b1.slice/crio-3aff1439424ad9b379022ee2876646e072c1be2712fd3088541dd5270ba9ab8f WatchSource:0}: Error finding container 3aff1439424ad9b379022ee2876646e072c1be2712fd3088541dd5270ba9ab8f: Status 404 returned error can't find the container with id 3aff1439424ad9b379022ee2876646e072c1be2712fd3088541dd5270ba9ab8f
Apr 24 15:05:28.185063 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:28.185047 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 15:05:28.295235 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:28.295203 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84" event={"ID":"37f44c9a-a159-4884-b485-aab3044770b1","Type":"ContainerStarted","Data":"e18f8cc2588953662746db111d6fcd495f5cbff5b6c3b070e1baec232ca56703"}
Apr 24 15:05:28.295235 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:28.295238 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84" event={"ID":"37f44c9a-a159-4884-b485-aab3044770b1","Type":"ContainerStarted","Data":"3aff1439424ad9b379022ee2876646e072c1be2712fd3088541dd5270ba9ab8f"}
Apr 24 15:05:28.295449 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:28.295265 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84"
Apr 24 15:05:28.309864 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:28.309768 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84" podStartSLOduration=1.3097517810000001 podStartE2EDuration="1.309751781s" podCreationTimestamp="2026-04-24 15:05:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:05:28.308985266 +0000 UTC m=+2474.640263031" watchObservedRunningTime="2026-04-24 15:05:28.309751781 +0000 UTC m=+2474.641029546"
Apr 24 15:05:28.572030 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:28.571949 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5d475cf49c-4txv4_3c497767-0280-48b6-a885-0915f5cc5c12/registry/0.log"
Apr 24 15:05:28.627590 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:28.627560 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tlvj4_d2eb35e3-b76b-433c-b90d-a4481a2cd709/node-ca/0.log"
Apr 24 15:05:29.383045 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:29.383009 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-6d7fr_35817697-9201-4809-ab68-ca05ef564674/discovery/0.log"
Apr 24 15:05:29.423751 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:29.423725 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5d8799dfd8-qtxtc_de813c8e-7fcb-4f67-b2eb-58050b724a12/router/0.log"
Apr 24 15:05:29.827801 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:29.827775 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-5gpks_aa089429-959e-4b12-bddb-1d6d0ce963c9/serve-healthcheck-canary/0.log"
Apr 24 15:05:30.270360 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:30.270333 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-8blnj_dede47ae-02f2-408e-947c-484180d89394/insights-operator/0.log"
Apr 24 15:05:30.270530 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:30.270456 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-8blnj_dede47ae-02f2-408e-947c-484180d89394/insights-operator/1.log"
Apr 24 15:05:30.387004 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:30.386979 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-z8s9x_6836012d-7ba1-41cd-a9e7-bac6c71b7852/kube-rbac-proxy/0.log"
Apr 24 15:05:30.406221 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:30.406193 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-z8s9x_6836012d-7ba1-41cd-a9e7-bac6c71b7852/exporter/0.log"
Apr 24 15:05:30.425359 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:30.425334 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-z8s9x_6836012d-7ba1-41cd-a9e7-bac6c71b7852/extractor/0.log"
Apr 24 15:05:32.878967 ip-10-0-138-116 kubenswrapper[2571]: I0424
15:05:32.878939 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6d967bb649-24ssb_b75e0293-0089-469d-a700-637aba0a6951/manager/0.log" Apr 24 15:05:32.919989 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:32.919954 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-vznf7_ccea0364-b805-4174-8095-2979e31415e6/openshift-lws-operator/0.log" Apr 24 15:05:33.444791 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:33.444761 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-5668f86b7c-hzgf6_0d48fa20-44db-4656-9284-3d8b53aa7608/manager/0.log" Apr 24 15:05:34.309736 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:34.309707 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-wtv84" Apr 24 15:05:39.591370 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:39.591337 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m79vj_aace953f-49d0-4c47-9522-d00bf8dece62/kube-multus-additional-cni-plugins/0.log" Apr 24 15:05:39.611558 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:39.611488 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m79vj_aace953f-49d0-4c47-9522-d00bf8dece62/egress-router-binary-copy/0.log" Apr 24 15:05:39.634523 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:39.634489 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m79vj_aace953f-49d0-4c47-9522-d00bf8dece62/cni-plugins/0.log" Apr 24 15:05:39.655037 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:39.655010 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m79vj_aace953f-49d0-4c47-9522-d00bf8dece62/bond-cni-plugin/0.log" Apr 24 15:05:39.674112 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:39.674061 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m79vj_aace953f-49d0-4c47-9522-d00bf8dece62/routeoverride-cni/0.log" Apr 24 15:05:39.694916 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:39.694886 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m79vj_aace953f-49d0-4c47-9522-d00bf8dece62/whereabouts-cni-bincopy/0.log" Apr 24 15:05:39.713241 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:39.713213 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m79vj_aace953f-49d0-4c47-9522-d00bf8dece62/whereabouts-cni/0.log" Apr 24 15:05:39.762698 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:39.762666 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jxm64_75370bd4-7795-4ebf-8a12-27eda2d9b1d7/kube-multus/0.log" Apr 24 15:05:39.837678 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:39.837652 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dkhdd_7ed1658e-98f8-4fe9-bb01-60b235015d4b/network-metrics-daemon/0.log" Apr 24 15:05:39.856636 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:39.856606 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dkhdd_7ed1658e-98f8-4fe9-bb01-60b235015d4b/kube-rbac-proxy/0.log" Apr 24 15:05:40.652577 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:40.652546 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-controller/0.log" Apr 24 15:05:40.671126 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:40.671083 2571 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/0.log" Apr 24 15:05:40.686506 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:40.686477 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovn-acl-logging/1.log" Apr 24 15:05:40.705531 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:40.705502 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/kube-rbac-proxy-node/0.log" Apr 24 15:05:40.727484 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:40.727454 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 15:05:40.744900 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:40.744872 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/northd/0.log" Apr 24 15:05:40.767499 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:40.767476 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/nbdb/0.log" Apr 24 15:05:40.790819 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:40.790794 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/sbdb/0.log" Apr 24 15:05:40.890805 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:40.890778 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ksfw_151dbb1d-0d3a-4890-8076-f774d13b7e70/ovnkube-controller/0.log" Apr 24 15:05:42.435457 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:42.435420 2571 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-jlk8v_0a1eaa98-906e-4458-8492-83342d8bdd0f/network-check-target-container/0.log" Apr 24 15:05:43.364791 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:43.364763 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-hwhv7_7b650bb0-88a4-4b81-a9d8-a1f2b16a8c46/iptables-alerter/0.log" Apr 24 15:05:44.095968 ip-10-0-138-116 kubenswrapper[2571]: I0424 15:05:44.095936 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-xgvkw_3307a337-f7bb-48ac-bb80-128ee9a46983/tuned/0.log"