Apr 20 14:59:46.488011 ip-10-0-129-115 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 14:59:46.488026 ip-10-0-129-115 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 14:59:46.488036 ip-10-0-129-115 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 14:59:46.488398 ip-10-0-129-115 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 14:59:56.497845 ip-10-0-129-115 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 14:59:56.497862 ip-10-0-129-115 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot f5fb2ec2d5e34c33a7ad472dd3c4851e --
Apr 20 15:02:29.432652 ip-10-0-129-115 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 15:02:29.930152 ip-10-0-129-115 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 15:02:29.930152 ip-10-0-129-115 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 15:02:29.930152 ip-10-0-129-115 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 15:02:29.930152 ip-10-0-129-115 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 15:02:29.930152 ip-10-0-129-115 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 15:02:29.933259 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.933095 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 15:02:29.936548 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936528 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 15:02:29.936548 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936547 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 15:02:29.936622 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936551 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 15:02:29.936622 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936555 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 15:02:29.936622 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936558 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 15:02:29.936622 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936561 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 15:02:29.936622 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936565 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 15:02:29.936622 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936574 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 15:02:29.936622 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936577 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 15:02:29.936622 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936580 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 15:02:29.936622 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936583 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 15:02:29.936622 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936585 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 15:02:29.936622 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936588 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 15:02:29.936622 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936591 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 15:02:29.936622 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936594 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 15:02:29.936622 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936597 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 15:02:29.936622 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936600 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 15:02:29.936622 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936602 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 15:02:29.936622 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936605 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 15:02:29.936622 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936608 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 15:02:29.936622 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936611 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 15:02:29.936622 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936614 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 15:02:29.937096 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936617 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 15:02:29.937096 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936619 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 15:02:29.937096 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936622 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 15:02:29.937096 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936625 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 15:02:29.937096 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936628 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 15:02:29.937096 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936631 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 15:02:29.937096 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936634 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 15:02:29.937096 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936665 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 15:02:29.937096 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936671 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 15:02:29.937096 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936674 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 15:02:29.937096 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936677 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 15:02:29.937096 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936679 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 15:02:29.937096 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936682 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 15:02:29.937096 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936685 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 15:02:29.937096 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936688 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 15:02:29.937096 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936691 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 15:02:29.937096 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936693 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 15:02:29.937096 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936696 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 15:02:29.937096 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936698 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 15:02:29.937614 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936701 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 15:02:29.937614 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936703 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 15:02:29.937614 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936706 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 15:02:29.937614 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936708 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 15:02:29.937614 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936711 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 15:02:29.937614 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936713 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 15:02:29.937614 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936716 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 15:02:29.937614 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936718 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 15:02:29.937614 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936721 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 15:02:29.937614 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936723 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 15:02:29.937614 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936726 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 15:02:29.937614 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936730 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 15:02:29.937614 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936734 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 15:02:29.937614 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936737 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 15:02:29.937614 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936741 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 15:02:29.937614 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936744 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 15:02:29.937614 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936747 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 15:02:29.937614 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936750 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 15:02:29.937614 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936753 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 15:02:29.937614 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936755 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 15:02:29.938124 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936758 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 15:02:29.938124 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936762 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 15:02:29.938124 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936765 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 15:02:29.938124 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936767 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 15:02:29.938124 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936770 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 15:02:29.938124 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936778 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 15:02:29.938124 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936780 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 15:02:29.938124 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936783 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 15:02:29.938124 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936786 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 15:02:29.938124 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936789 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 15:02:29.938124 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936791 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 15:02:29.938124 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936794 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 15:02:29.938124 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936796 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 15:02:29.938124 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936799 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 15:02:29.938124 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936801 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 15:02:29.938124 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936804 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 15:02:29.938124 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936806 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 15:02:29.938124 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936809 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 15:02:29.938124 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936811 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 15:02:29.938124 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936814 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 15:02:29.938124 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936817 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 15:02:29.938660 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936819 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 15:02:29.938660 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936821 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 15:02:29.938660 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936824 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 15:02:29.938660 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.936827 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 15:02:29.938660 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937240 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 15:02:29.938660 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937246 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 15:02:29.938660 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937250 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 15:02:29.938660 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937253 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 15:02:29.938660 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937255 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 15:02:29.938660 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937258 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 15:02:29.938660 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937261 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 15:02:29.938660 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937264 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 15:02:29.938660 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937266 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 15:02:29.938660 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937269 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 15:02:29.938660 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937272 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 15:02:29.938660 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937274 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 15:02:29.938660 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937296 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 15:02:29.938660 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937299 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 15:02:29.938660 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937303 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 15:02:29.938660 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937306 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 15:02:29.939237 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937309 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 15:02:29.939237 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937312 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 15:02:29.939237 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937314 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 15:02:29.939237 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937317 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 15:02:29.939237 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937319 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 15:02:29.939237 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937322 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 15:02:29.939237 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937326 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 15:02:29.939237 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937329 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 15:02:29.939237 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937332 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 15:02:29.939237 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937334 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 15:02:29.939237 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937337 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 15:02:29.939237 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937340 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 15:02:29.939237 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937342 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 15:02:29.939237 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937345 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 15:02:29.939237 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937347 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 15:02:29.939237 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937350 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 15:02:29.939237 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937352 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 15:02:29.939237 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937357 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 15:02:29.939237 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937360 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 15:02:29.939237 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937363 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 15:02:29.939799 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937365 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 15:02:29.939799 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937368 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 15:02:29.939799 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937370 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 15:02:29.939799 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937374 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 15:02:29.939799 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937377 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 15:02:29.939799 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937380 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 15:02:29.939799 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937383 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 15:02:29.939799 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937385 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 15:02:29.939799 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937388 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 15:02:29.939799 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937391 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 15:02:29.939799 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937393 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 15:02:29.939799 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937399 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 15:02:29.939799 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937402 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 15:02:29.939799 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937404 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 15:02:29.939799 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937407 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 15:02:29.939799 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937410 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 15:02:29.939799 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937413 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 15:02:29.939799 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937415 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 15:02:29.939799 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937418 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 15:02:29.939799 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937420 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 15:02:29.940303 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937423 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 15:02:29.940303 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937425 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 15:02:29.940303 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937427 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 15:02:29.940303 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937430 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 15:02:29.940303 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937432 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 15:02:29.940303 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937435 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 15:02:29.940303 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937438 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 15:02:29.940303 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937441 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 15:02:29.940303 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937443 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 15:02:29.940303 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937446 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 15:02:29.940303 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937450 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 15:02:29.940303 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937452 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 15:02:29.940303 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937455 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 15:02:29.940303 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937458 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 15:02:29.940303 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937460 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 15:02:29.940303 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937463 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 15:02:29.940303 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937466 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 15:02:29.940303 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937468 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 15:02:29.940303 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937471 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 15:02:29.940303 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937473 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 15:02:29.940793 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937475 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 15:02:29.940793 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937478 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 15:02:29.940793 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937481 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 15:02:29.940793 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937485 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 15:02:29.940793 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937489 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 15:02:29.940793 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937492 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 15:02:29.940793 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937495 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 15:02:29.940793 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937498 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 15:02:29.940793 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937500 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 15:02:29.940793 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.937503 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 15:02:29.940793 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938752 2577 flags.go:64] FLAG: --address="0.0.0.0" Apr 20 15:02:29.940793 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938763 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 20 15:02:29.940793 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938777 2577 flags.go:64] FLAG: --anonymous-auth="true" Apr 20 15:02:29.940793 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938787 2577 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 20 15:02:29.940793 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938792 2577 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 20 15:02:29.940793 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938796 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 20 15:02:29.940793 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938801 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 20 15:02:29.940793 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938806 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 20 15:02:29.940793 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938810 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 20 15:02:29.940793 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938813 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938817 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938821 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938824 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938828 2577 flags.go:64] FLAG: --cgroup-root="" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938831 2577 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938834 2577 flags.go:64] FLAG: --client-ca-file="" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938837 2577 flags.go:64] FLAG: --cloud-config="" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938840 2577 flags.go:64] FLAG: --cloud-provider="external" Apr 20 15:02:29.941344 ip-10-0-129-115 
kubenswrapper[2577]: I0420 15:02:29.938843 2577 flags.go:64] FLAG: --cluster-dns="[]" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938848 2577 flags.go:64] FLAG: --cluster-domain="" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938851 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938854 2577 flags.go:64] FLAG: --config-dir="" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938857 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938861 2577 flags.go:64] FLAG: --container-log-max-files="5" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938865 2577 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938868 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938871 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938876 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938879 2577 flags.go:64] FLAG: --contention-profiling="false" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938882 2577 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938885 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938889 2577 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938892 2577 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938896 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 20 15:02:29.941344 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938899 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938903 2577 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938906 2577 flags.go:64] FLAG: --enable-load-reader="false" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938909 2577 flags.go:64] FLAG: --enable-server="true" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938912 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938920 2577 flags.go:64] FLAG: --event-burst="100" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938923 2577 flags.go:64] FLAG: --event-qps="50" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938926 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938930 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938933 2577 flags.go:64] FLAG: --eviction-hard="" Apr 20 
15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938937 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938940 2577 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938943 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938947 2577 flags.go:64] FLAG: --eviction-soft="" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938950 2577 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938953 2577 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938956 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938959 2577 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938962 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938965 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938968 2577 flags.go:64] FLAG: --feature-gates="" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938972 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938975 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938979 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938982 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938986 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 20 15:02:29.941966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938989 2577 flags.go:64] FLAG: --help="false" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938993 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-129-115.ec2.internal" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938996 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.938999 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939002 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939006 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939009 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939012 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939015 2577 flags.go:64] FLAG: 
--image-service-endpoint="" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939018 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939024 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939027 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939031 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939034 2577 flags.go:64] FLAG: --kube-reserved="" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939037 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939039 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939043 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939046 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939049 2577 flags.go:64] FLAG: --lock-file="" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939052 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939054 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939058 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939063 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939066 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 15:02:29.942640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939069 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939071 2577 flags.go:64] FLAG: --logging-format="text" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939075 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939078 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939081 2577 flags.go:64] FLAG: --manifest-url="" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939084 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939089 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939092 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939096 2577 flags.go:64] FLAG: --max-pods="110" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939100 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939103 2577 flags.go:64] FLAG: 
--maximum-dead-containers-per-container="1" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939106 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939109 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939112 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939115 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939118 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939127 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939130 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939135 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939138 2577 flags.go:64] FLAG: --pod-cidr="" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939141 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939162 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939166 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939170 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 20 15:02:29.943210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939174 2577 flags.go:64] FLAG: --port="10250" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939177 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939180 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-053bc78fa52d1213a" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939184 2577 flags.go:64] FLAG: --qos-reserved="" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939187 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939190 2577 flags.go:64] FLAG: --register-node="true" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939193 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939196 2577 flags.go:64] FLAG: --register-with-taints="" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939200 2577 flags.go:64] FLAG: --registry-burst="10" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939203 2577 flags.go:64] FLAG: --registry-qps="5" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939206 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939208 2577 flags.go:64] FLAG: --reserved-memory="" Apr 20 15:02:29.943832 
ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939212 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939216 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939219 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939222 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939224 2577 flags.go:64] FLAG: --runonce="false" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939227 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939231 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939234 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939237 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939240 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939243 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939247 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939251 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939254 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 15:02:29.943832 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939259 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939262 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939265 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939268 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939271 2577 flags.go:64] FLAG: --system-cgroups="" Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939274 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939280 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939295 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939299 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939303 2577 flags.go:64] FLAG: --tls-min-version="" Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939306 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939309 2577 flags.go:64] FLAG: 
--topology-manager-policy="none" Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939312 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939315 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939318 2577 flags.go:64] FLAG: --v="2" Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939322 2577 flags.go:64] FLAG: --version="false" Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939326 2577 flags.go:64] FLAG: --vmodule="" Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939331 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.939334 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939455 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939460 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939464 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939467 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939470 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 15:02:29.944475 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939473 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 15:02:29.945048 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939476 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 15:02:29.945048 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939479 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 15:02:29.945048 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939481 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 15:02:29.945048 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939484 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 15:02:29.945048 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939487 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 15:02:29.945048 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939490 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 15:02:29.945048 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939493 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 15:02:29.945048 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939497 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 15:02:29.945048 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939500 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 15:02:29.945048 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939503 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 15:02:29.945048 
ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939506 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 15:02:29.945048 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939509 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 15:02:29.945048 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939511 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 15:02:29.945048 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939514 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 15:02:29.945048 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939517 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 15:02:29.945048 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939519 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 15:02:29.945048 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939522 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 15:02:29.945048 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939524 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 15:02:29.945048 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939527 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 15:02:29.945048 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939531 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 15:02:29.945579 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939535 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 15:02:29.945579 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939539 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 15:02:29.945579 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939542 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 15:02:29.945579 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939545 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 15:02:29.945579 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939548 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 15:02:29.945579 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939551 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 15:02:29.945579 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939554 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 20 15:02:29.945579 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939556 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 15:02:29.945579 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939559 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 15:02:29.945579 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939562 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 15:02:29.945579 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939564 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 15:02:29.945579 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939567 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 15:02:29.945579 ip-10-0-129-115 
kubenswrapper[2577]: W0420 15:02:29.939569 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 15:02:29.945579 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939572 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 15:02:29.945579 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939575 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 15:02:29.945579 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939577 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 15:02:29.945579 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939580 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 15:02:29.945579 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939586 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 15:02:29.945579 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939589 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 15:02:29.945579 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939593 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 15:02:29.946139 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939596 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 15:02:29.946139 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939598 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 15:02:29.946139 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939601 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 15:02:29.946139 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939603 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 15:02:29.946139 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939606 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 15:02:29.946139 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939609 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 15:02:29.946139 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939613 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
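The long runs of "unrecognized feature gate" warnings come from cluster-level gate names (PinnedImages, ManagedBootImages, GatewayAPI, and so on) being handed to a kubelet binary that does not itself register them; each parsing pass logs the names it does not recognize at warning level and carries on, which is why the same set repeats several times before each `feature gates: {map[...]}` summary. For pulling the distinct names out of a saved journal dump, a small standalone helper along these lines can collapse the repeated passes into one sorted list (the file name and the whole program are illustrative, not part of the kubelet or OpenShift):

```go
// unknown_gates.go — illustrative helper (not part of the kubelet or OpenShift):
// reads a saved journal dump on stdin and prints each distinct gate name that
// appeared in an "unrecognized feature gate: <Name>" warning, once, sorted.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"sort"
)

func main() {
	re := regexp.MustCompile(`unrecognized feature gate: (\S+)`)
	seen := map[string]struct{}{}

	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		if m := re.FindStringSubmatch(sc.Text()); m != nil {
			seen[m[1]] = struct{}{}
		}
	}

	names := make([]string, 0, len(seen))
	for n := range seen {
		names = append(names, n)
	}
	sort.Strings(names)
	for _, n := range names {
		fmt.Println(n)
	}
}
```

Run against the unit log, for example `journalctl -u kubelet.service --no-pager | go run unknown_gates.go`, to see which gate names the binary rejected without scrolling through every repeated pass.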
Apr 20 15:02:29.946139 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939617 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 15:02:29.946139 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939620 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 15:02:29.946139 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939622 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 15:02:29.946139 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939625 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 15:02:29.946139 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939628 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 15:02:29.946139 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939630 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 15:02:29.946139 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939633 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 15:02:29.946139 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939636 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 15:02:29.946139 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939638 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 15:02:29.946139 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939641 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 15:02:29.946139 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939643 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 15:02:29.946139 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939646 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 15:02:29.946139 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939649 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 15:02:29.946934 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939652 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 15:02:29.946934 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939654 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 15:02:29.946934 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939657 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 15:02:29.946934 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939659 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 15:02:29.946934 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939662 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 15:02:29.946934 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939664 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 15:02:29.946934 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939668 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 15:02:29.946934 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939670 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 15:02:29.946934 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939673 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 15:02:29.946934 ip-10-0-129-115 
kubenswrapper[2577]: W0420 15:02:29.939677 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 15:02:29.946934 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939680 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 15:02:29.946934 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939684 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 15:02:29.946934 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939687 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 15:02:29.946934 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939690 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 15:02:29.946934 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939692 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 15:02:29.946934 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939695 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 15:02:29.946934 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939697 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 15:02:29.946934 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939700 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 15:02:29.946934 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939703 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 15:02:29.946934 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.939705 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 15:02:29.947735 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.940492 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 15:02:29.948537 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.948515 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 15:02:29.948579 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.948538 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 15:02:29.948610 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948594 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 15:02:29.948610 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948600 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 15:02:29.948610 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948604 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 15:02:29.948610 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948607 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 15:02:29.948610 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948611 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 15:02:29.948740 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948615 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 15:02:29.948740 
ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948619 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 15:02:29.948740 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948622 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 15:02:29.948740 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948625 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 15:02:29.948740 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948628 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 15:02:29.948740 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948631 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 15:02:29.948740 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948634 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 15:02:29.948740 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948639 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 15:02:29.948740 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948643 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 15:02:29.948740 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948646 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 15:02:29.948740 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948649 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 15:02:29.948740 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948652 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 15:02:29.948740 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948673 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 15:02:29.948740 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948677 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 15:02:29.948740 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948680 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 15:02:29.948740 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948683 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 15:02:29.948740 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948686 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 15:02:29.948740 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948689 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 15:02:29.948740 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948692 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 15:02:29.948740 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948695 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 15:02:29.949240 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948698 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 15:02:29.949240 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948701 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 15:02:29.949240 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948703 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 20 15:02:29.949240 ip-10-0-129-115 
kubenswrapper[2577]: W0420 15:02:29.948706 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 15:02:29.949240 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948709 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 15:02:29.949240 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948712 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 15:02:29.949240 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948714 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 15:02:29.949240 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948717 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 15:02:29.949240 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948720 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 15:02:29.949240 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948722 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 15:02:29.949240 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948725 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 15:02:29.949240 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948728 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 15:02:29.949240 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948730 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 15:02:29.949240 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948734 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 15:02:29.949240 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948736 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 15:02:29.949240 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948739 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 15:02:29.949240 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948742 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 15:02:29.949240 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948746 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 15:02:29.949240 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948751 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 15:02:29.949722 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948754 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 15:02:29.949722 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948757 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 15:02:29.949722 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948759 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 15:02:29.949722 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948762 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 15:02:29.949722 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948765 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 15:02:29.949722 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948768 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 15:02:29.949722 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948771 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 15:02:29.949722 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948774 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 15:02:29.949722 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948776 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 15:02:29.949722 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948779 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 15:02:29.949722 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948781 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 15:02:29.949722 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948784 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 15:02:29.949722 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948787 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 15:02:29.949722 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948789 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 15:02:29.949722 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948792 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 15:02:29.949722 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948794 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 15:02:29.949722 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948797 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 15:02:29.949722 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948800 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 15:02:29.949722 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948803 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 15:02:29.949722 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948806 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 15:02:29.950210 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948808 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 15:02:29.950210 
ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948811 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 15:02:29.950210 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948814 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 15:02:29.950210 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948817 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 15:02:29.950210 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948819 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 15:02:29.950210 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948822 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 15:02:29.950210 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948825 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 15:02:29.950210 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948828 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 15:02:29.950210 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948831 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 15:02:29.950210 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948833 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 15:02:29.950210 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948836 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 15:02:29.950210 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948838 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 15:02:29.950210 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948842 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 15:02:29.950210 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948845 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 15:02:29.950210 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948847 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 15:02:29.950210 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948850 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 15:02:29.950210 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948867 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 15:02:29.950210 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948870 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 15:02:29.950210 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948874 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 15:02:29.950210 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948876 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 15:02:29.950720 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948879 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 15:02:29.950720 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948883 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 15:02:29.950720 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.948888 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false 
NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 15:02:29.950720 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.948995 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 15:02:29.950720 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949000 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 15:02:29.950720 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949004 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 15:02:29.950720 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949007 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 15:02:29.950720 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949009 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 15:02:29.950720 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949012 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 15:02:29.950720 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949015 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 15:02:29.950720 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949018 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 15:02:29.950720 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949021 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 15:02:29.950720 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949024 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 15:02:29.950720 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949026 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 20 15:02:29.950720 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949029 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 15:02:29.951091 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949031 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 15:02:29.951091 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949034 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 15:02:29.951091 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949036 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 15:02:29.951091 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949040 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 15:02:29.951091 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949042 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 15:02:29.951091 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949045 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 15:02:29.951091 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949048 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 15:02:29.951091 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949051 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 15:02:29.951091 
ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949053 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 15:02:29.951091 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949056 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 15:02:29.951091 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949060 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 15:02:29.951091 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949065 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 15:02:29.951091 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949068 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 15:02:29.951091 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949071 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 15:02:29.951091 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949074 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 15:02:29.951091 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949076 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 15:02:29.951091 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949079 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 15:02:29.951091 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949081 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 15:02:29.951091 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949084 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 15:02:29.951091 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949086 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 15:02:29.951618 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949089 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 15:02:29.951618 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949091 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 15:02:29.951618 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949094 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 15:02:29.951618 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949096 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 15:02:29.951618 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949101 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 15:02:29.951618 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949105 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 15:02:29.951618 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949108 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 15:02:29.951618 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949111 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 15:02:29.951618 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949114 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 15:02:29.951618 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949117 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 15:02:29.951618 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949119 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 15:02:29.951618 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949122 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 15:02:29.951618 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949125 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 15:02:29.951618 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949127 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 15:02:29.951618 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949130 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 15:02:29.951618 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949133 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 15:02:29.951618 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949136 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 15:02:29.951618 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949138 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 15:02:29.951618 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949141 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 15:02:29.951618 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949143 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 15:02:29.952102 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949146 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 15:02:29.952102 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949149 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 15:02:29.952102 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949152 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 15:02:29.952102 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949155 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 15:02:29.952102 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949157 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 15:02:29.952102 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949160 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 15:02:29.952102 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949162 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 15:02:29.952102 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949165 2577 
feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 15:02:29.952102 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949167 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 15:02:29.952102 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949169 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 15:02:29.952102 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949172 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 15:02:29.952102 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949175 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 15:02:29.952102 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949177 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 15:02:29.952102 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949180 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 15:02:29.952102 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949182 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 15:02:29.952102 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949185 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 15:02:29.952102 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949188 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 15:02:29.952102 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949191 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 15:02:29.952102 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949193 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 15:02:29.952102 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949196 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 15:02:29.952607 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949200 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
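Each parsing pass ends with a `feature_gate.go:384` entry that prints the gate map the kubelet actually applied, such as the `feature gates: {map[DynamicResourceAllocation:false ... VolumeAttributesClass:false]}` summaries above and below. When comparing two of those summaries (for instance across reboots), it can help to turn the printed map back into a structured value first; a rough sketch, assuming the brace-and-space layout shown in this log (the function and file name are made up for illustration):

```go
// gate_summary.go — illustrative only: parses a kubelet "feature gates: {map[...]}"
// summary line (as printed at feature_gate.go:384 in this log) into a Go map so
// two summaries can be compared programmatically.
package main

import (
	"fmt"
	"regexp"
	"strconv"
	"strings"
)

// parseGateSummary extracts Name:true/Name:false pairs from the "{map[...]}" part
// of a summary line; it returns an empty map if the line has no such section.
func parseGateSummary(line string) map[string]bool {
	gates := map[string]bool{}
	start := strings.Index(line, "{map[")
	if start < 0 {
		return gates
	}
	re := regexp.MustCompile(`([A-Za-z0-9]+):(true|false)`)
	for _, m := range re.FindAllStringSubmatch(line[start:], -1) {
		v, _ := strconv.ParseBool(m[2])
		gates[m[1]] = v
	}
	return gates
}

func main() {
	// A shortened example in the same shape as the log's summary lines.
	line := `I0420 15:02:29.949238 2577 feature_gate.go:384] feature gates: {map[ImageVolume:true KMSv1:true NodeSwap:false]}`
	fmt.Println(parseGateSummary(line)) // map[ImageVolume:true KMSv1:true NodeSwap:false]
}
```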
Apr 20 15:02:29.952607 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949203 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 15:02:29.952607 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949205 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 15:02:29.952607 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949208 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 15:02:29.952607 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949210 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 15:02:29.952607 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949213 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 15:02:29.952607 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949215 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 15:02:29.952607 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949218 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 15:02:29.952607 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949220 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 15:02:29.952607 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949222 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 15:02:29.952607 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949225 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 15:02:29.952607 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949227 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 15:02:29.952607 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949230 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 15:02:29.952607 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:29.949232 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 15:02:29.952607 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.949238 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 15:02:29.953028 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.950020 2577 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 15:02:29.954424 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.954407 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 15:02:29.955327 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.955316 2577 server.go:1019] "Starting client certificate rotation" Apr 20 15:02:29.955433 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.955415 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 15:02:29.955472 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.955456 2577 certificate_manager.go:566] "Rotating certificates" 
logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 15:02:29.984966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.984941 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 15:02:29.987745 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:29.987705 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 15:02:30.003604 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.003577 2577 log.go:25] "Validated CRI v1 runtime API" Apr 20 15:02:30.013633 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.013603 2577 log.go:25] "Validated CRI v1 image API" Apr 20 15:02:30.015576 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.015551 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 15:02:30.015576 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.015553 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 20 15:02:30.019197 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.019169 2577 fs.go:135] Filesystem UUIDs: map[444a6e2e-52e0-4705-a399-b2e52ade37c0:/dev/nvme0n1p4 5860190a-504a-4330-a9be-4e32562ccc94:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 20 15:02:30.019301 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.019192 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 20 15:02:30.024521 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.024395 2577 manager.go:217] Machine: {Timestamp:2026-04-20 15:02:30.023247626 +0000 UTC m=+0.458791772 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100324 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2aac232b5af58996fdbc7b0bb64646 SystemUUID:ec2aac23-2b5a-f589-96fd-bc7b0bb64646 BootID:f5fb2ec2-d5e3-4c33-a7ad-472dd3c4851e Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c1:18:25:f0:85 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c1:18:25:f0:85 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:86:70:93:73:5b:78 Speed:0 Mtu:1500}] 
Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 20 15:02:30.024521 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.024509 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 20 15:02:30.024717 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.024632 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 20 15:02:30.026417 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.026379 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 20 15:02:30.026604 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.026421 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-115.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 15:02:30.026682 ip-10-0-129-115 
kubenswrapper[2577]: I0420 15:02:30.026623 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 15:02:30.026682 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.026637 2577 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 15:02:30.026682 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.026656 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 15:02:30.027411 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.027397 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 15:02:30.028329 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.028316 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 20 15:02:30.028472 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.028460 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 15:02:30.031029 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.031017 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 20 15:02:30.031087 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.031037 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 15:02:30.031087 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.031055 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 15:02:30.031087 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.031071 2577 kubelet.go:397] "Adding apiserver pod source" Apr 20 15:02:30.031087 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.031087 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 15:02:30.032413 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.032398 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 15:02:30.032490 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.032423 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 15:02:30.033675 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.033653 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hn6mx" Apr 20 15:02:30.035751 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.035732 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 15:02:30.037340 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.037326 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 15:02:30.039243 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.039221 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 15:02:30.039340 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.039250 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 15:02:30.039340 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.039263 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 15:02:30.039340 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.039273 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 15:02:30.039340 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.039296 2577 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 15:02:30.039340 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.039303 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 15:02:30.039340 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.039309 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 15:02:30.039340 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.039315 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 15:02:30.039340 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.039325 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 15:02:30.039340 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.039331 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 15:02:30.039340 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.039340 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 15:02:30.039621 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.039350 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 15:02:30.040313 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.040301 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 15:02:30.040350 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.040314 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 15:02:30.041931 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.041912 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hn6mx" Apr 20 15:02:30.042752 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.042735 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-115.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 15:02:30.043749 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:30.043721 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 15:02:30.043749 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:30.043732 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-115.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 15:02:30.044226 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.044214 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 15:02:30.044308 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.044296 2577 server.go:1295] "Started kubelet" Apr 20 15:02:30.044371 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.044340 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 15:02:30.044504 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.044448 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 15:02:30.044568 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.044534 2577 server_v1.go:47] "podresources" 
method="list" useActivePods=true Apr 20 15:02:30.045215 ip-10-0-129-115 systemd[1]: Started Kubernetes Kubelet. Apr 20 15:02:30.047239 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.047225 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 20 15:02:30.048185 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.048168 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 15:02:30.054892 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.054866 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 15:02:30.055396 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.055367 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 15:02:30.057203 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.057095 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 15:02:30.057302 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.057205 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 15:02:30.057302 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.057253 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 15:02:30.057486 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.057467 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 20 15:02:30.057486 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.057486 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 20 15:02:30.057687 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:30.057666 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 15:02:30.057748 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.057708 2577 factory.go:55] Registering systemd factory Apr 20 15:02:30.057857 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:30.057817 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-115.ec2.internal\" not found" Apr 20 15:02:30.057910 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.057897 2577 factory.go:223] Registration of the systemd container factory successfully Apr 20 15:02:30.058316 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.058297 2577 factory.go:153] Registering CRI-O factory Apr 20 15:02:30.058396 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.058319 2577 factory.go:223] Registration of the crio container factory successfully Apr 20 15:02:30.058463 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.058396 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 15:02:30.058463 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.058417 2577 factory.go:103] Registering Raw factory Apr 20 15:02:30.058463 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.058432 2577 manager.go:1196] Started watching for new ooms in manager Apr 20 15:02:30.059241 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.059217 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 15:02:30.061458 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:30.061434 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the 
node lease" err="nodes \"ip-10-0-129-115.ec2.internal\" not found" node="ip-10-0-129-115.ec2.internal" Apr 20 15:02:30.061640 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.061625 2577 manager.go:319] Starting recovery of all containers Apr 20 15:02:30.071782 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.071634 2577 manager.go:324] Recovery completed Apr 20 15:02:30.076576 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.076558 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 15:02:30.079297 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.079271 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-115.ec2.internal" event="NodeHasSufficientMemory" Apr 20 15:02:30.079382 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.079325 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-115.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 15:02:30.079382 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.079337 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-115.ec2.internal" event="NodeHasSufficientPID" Apr 20 15:02:30.079934 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.079920 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 15:02:30.079934 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.079932 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 15:02:30.080031 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.079950 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 20 15:02:30.082486 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.082474 2577 policy_none.go:49] "None policy: Start" Apr 20 15:02:30.082532 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.082490 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 15:02:30.082532 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.082501 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 20 15:02:30.124901 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.120965 2577 manager.go:341] "Starting Device Plugin manager" Apr 20 15:02:30.124901 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:30.121010 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 15:02:30.124901 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.121021 2577 server.go:85] "Starting device plugin registration server" Apr 20 15:02:30.124901 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.121324 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 15:02:30.124901 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.121339 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 15:02:30.124901 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.121537 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 15:02:30.124901 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.121616 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 15:02:30.124901 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.121622 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 15:02:30.124901 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:30.122140 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container 
filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 15:02:30.124901 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:30.122182 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-115.ec2.internal\" not found" Apr 20 15:02:30.189381 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.189279 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 15:02:30.190466 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.190448 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 15:02:30.190515 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.190479 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 15:02:30.190515 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.190500 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 20 15:02:30.190515 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.190506 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 15:02:30.190618 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:30.190540 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 15:02:30.192909 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.192881 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 15:02:30.221844 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.221808 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 15:02:30.223690 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.223670 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-115.ec2.internal" event="NodeHasSufficientMemory" Apr 20 15:02:30.223828 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.223706 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-115.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 15:02:30.223828 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.223721 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-115.ec2.internal" event="NodeHasSufficientPID" Apr 20 15:02:30.223828 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.223753 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-115.ec2.internal" Apr 20 15:02:30.232891 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.232867 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-115.ec2.internal" Apr 20 15:02:30.232999 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:30.232896 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-115.ec2.internal\": node \"ip-10-0-129-115.ec2.internal\" not found" Apr 20 15:02:30.247942 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:30.247912 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-115.ec2.internal\" not found" Apr 20 15:02:30.290990 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.290935 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-115.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-115.ec2.internal"] Apr 20 15:02:30.291154 ip-10-0-129-115 kubenswrapper[2577]: 
I0420 15:02:30.291066 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 15:02:30.292816 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.292798 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-115.ec2.internal" event="NodeHasSufficientMemory" Apr 20 15:02:30.292906 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.292836 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-115.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 15:02:30.292906 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.292850 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-115.ec2.internal" event="NodeHasSufficientPID" Apr 20 15:02:30.294508 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.294491 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 15:02:30.294667 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.294651 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-115.ec2.internal" Apr 20 15:02:30.294716 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.294685 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 15:02:30.296336 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.295902 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-115.ec2.internal" event="NodeHasSufficientMemory" Apr 20 15:02:30.296336 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.295935 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-115.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 15:02:30.296336 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.295957 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-115.ec2.internal" event="NodeHasSufficientPID" Apr 20 15:02:30.296681 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.296661 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-115.ec2.internal" event="NodeHasSufficientMemory" Apr 20 15:02:30.296758 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.296696 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-115.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 15:02:30.296758 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.296712 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-115.ec2.internal" event="NodeHasSufficientPID" Apr 20 15:02:30.297383 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.297365 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-115.ec2.internal" Apr 20 15:02:30.297463 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.297393 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 15:02:30.298300 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.298270 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-115.ec2.internal" event="NodeHasSufficientMemory" Apr 20 15:02:30.298370 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.298317 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-115.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 15:02:30.298370 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.298333 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-115.ec2.internal" event="NodeHasSufficientPID" Apr 20 15:02:30.322184 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:30.322165 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-115.ec2.internal\" not found" node="ip-10-0-129-115.ec2.internal" Apr 20 15:02:30.326833 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:30.326814 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-115.ec2.internal\" not found" node="ip-10-0-129-115.ec2.internal" Apr 20 15:02:30.348630 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:30.348604 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-115.ec2.internal\" not found" Apr 20 15:02:30.359464 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.359437 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6ec3cd5b714ec0026f823b211f468c89-config\") pod \"kube-apiserver-proxy-ip-10-0-129-115.ec2.internal\" (UID: \"6ec3cd5b714ec0026f823b211f468c89\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-115.ec2.internal" Apr 20 15:02:30.359531 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.359469 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bc6bbe5a7166ac8cdce602f3f8f7acd2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-115.ec2.internal\" (UID: \"bc6bbe5a7166ac8cdce602f3f8f7acd2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-115.ec2.internal" Apr 20 15:02:30.359531 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.359492 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc6bbe5a7166ac8cdce602f3f8f7acd2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-115.ec2.internal\" (UID: \"bc6bbe5a7166ac8cdce602f3f8f7acd2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-115.ec2.internal" Apr 20 15:02:30.449023 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:30.448951 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-115.ec2.internal\" not found" Apr 20 15:02:30.460406 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.460377 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/bc6bbe5a7166ac8cdce602f3f8f7acd2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-115.ec2.internal\" (UID: \"bc6bbe5a7166ac8cdce602f3f8f7acd2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-115.ec2.internal" Apr 20 15:02:30.460467 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.460418 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6ec3cd5b714ec0026f823b211f468c89-config\") pod \"kube-apiserver-proxy-ip-10-0-129-115.ec2.internal\" (UID: \"6ec3cd5b714ec0026f823b211f468c89\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-115.ec2.internal" Apr 20 15:02:30.460467 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.460438 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bc6bbe5a7166ac8cdce602f3f8f7acd2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-115.ec2.internal\" (UID: \"bc6bbe5a7166ac8cdce602f3f8f7acd2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-115.ec2.internal" Apr 20 15:02:30.460530 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.460485 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc6bbe5a7166ac8cdce602f3f8f7acd2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-115.ec2.internal\" (UID: \"bc6bbe5a7166ac8cdce602f3f8f7acd2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-115.ec2.internal" Apr 20 15:02:30.460565 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.460483 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bc6bbe5a7166ac8cdce602f3f8f7acd2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-115.ec2.internal\" (UID: \"bc6bbe5a7166ac8cdce602f3f8f7acd2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-115.ec2.internal" Apr 20 15:02:30.460565 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.460490 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6ec3cd5b714ec0026f823b211f468c89-config\") pod \"kube-apiserver-proxy-ip-10-0-129-115.ec2.internal\" (UID: \"6ec3cd5b714ec0026f823b211f468c89\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-115.ec2.internal" Apr 20 15:02:30.549792 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:30.549746 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-115.ec2.internal\" not found" Apr 20 15:02:30.624338 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.624307 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-115.ec2.internal" Apr 20 15:02:30.630053 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.630029 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-115.ec2.internal" Apr 20 15:02:30.650933 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:30.650882 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-115.ec2.internal\" not found" Apr 20 15:02:30.751704 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:30.751661 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-115.ec2.internal\" not found" Apr 20 15:02:30.852261 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:30.852233 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-115.ec2.internal\" not found" Apr 20 15:02:30.952774 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:30.952744 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-115.ec2.internal\" not found" Apr 20 15:02:30.954907 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.954887 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 15:02:30.955078 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.955050 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 15:02:30.955143 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:30.955078 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 15:02:31.043987 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:31.043899 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 14:57:30 +0000 UTC" deadline="2027-11-19 18:25:28.600891214 +0000 UTC" Apr 20 15:02:31.043987 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:31.043938 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13875h22m57.556956606s" Apr 20 15:02:31.054038 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:31.053999 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-115.ec2.internal\" not found" Apr 20 15:02:31.055131 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:31.055113 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 15:02:31.065996 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:31.065968 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 15:02:31.078647 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:31.078626 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 15:02:31.088403 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:31.088379 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-2cfhk" Apr 20 15:02:31.088527 ip-10-0-129-115 kubenswrapper[2577]: W0420 
15:02:31.088396 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc6bbe5a7166ac8cdce602f3f8f7acd2.slice/crio-e249ff946581a397d46f165e6958a2188dd0b5aa31446bf367ba16e9ba4a0333 WatchSource:0}: Error finding container e249ff946581a397d46f165e6958a2188dd0b5aa31446bf367ba16e9ba4a0333: Status 404 returned error can't find the container with id e249ff946581a397d46f165e6958a2188dd0b5aa31446bf367ba16e9ba4a0333 Apr 20 15:02:31.088806 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:31.088789 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ec3cd5b714ec0026f823b211f468c89.slice/crio-bfb664343482f8462519902456a8b0b368a68f84ecf94feb7474a4214ff9ff6e WatchSource:0}: Error finding container bfb664343482f8462519902456a8b0b368a68f84ecf94feb7474a4214ff9ff6e: Status 404 returned error can't find the container with id bfb664343482f8462519902456a8b0b368a68f84ecf94feb7474a4214ff9ff6e Apr 20 15:02:31.094161 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:31.094142 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-2cfhk" Apr 20 15:02:31.095177 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:31.095162 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 15:02:31.116549 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:31.116523 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 15:02:31.156570 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:31.156544 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-115.ec2.internal" Apr 20 15:02:31.167992 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:31.167967 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 15:02:31.168958 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:31.168944 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-115.ec2.internal" Apr 20 15:02:31.177099 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:31.177083 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 15:02:31.194218 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:31.194166 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-115.ec2.internal" event={"ID":"bc6bbe5a7166ac8cdce602f3f8f7acd2","Type":"ContainerStarted","Data":"e249ff946581a397d46f165e6958a2188dd0b5aa31446bf367ba16e9ba4a0333"} Apr 20 15:02:31.195047 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:31.195016 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-115.ec2.internal" event={"ID":"6ec3cd5b714ec0026f823b211f468c89","Type":"ContainerStarted","Data":"bfb664343482f8462519902456a8b0b368a68f84ecf94feb7474a4214ff9ff6e"} Apr 20 15:02:31.798105 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:31.798071 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 15:02:32.032023 ip-10-0-129-115 
kubenswrapper[2577]: I0420 15:02:32.031991 2577 apiserver.go:52] "Watching apiserver" Apr 20 15:02:32.038828 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.038797 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 15:02:32.039177 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.039155 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8","openshift-image-registry/node-ca-7vhjb","openshift-multus/multus-additional-cni-plugins-cx4lv","openshift-cluster-node-tuning-operator/tuned-9jslm","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-115.ec2.internal","openshift-multus/multus-mm98x","openshift-multus/network-metrics-daemon-dp887","openshift-network-diagnostics/network-check-target-z8nvj","openshift-network-operator/iptables-alerter-f6vw2","openshift-ovn-kubernetes/ovnkube-node-x9pxn","kube-system/konnectivity-agent-kztg4","kube-system/kube-apiserver-proxy-ip-10-0-129-115.ec2.internal"] Apr 20 15:02:32.041101 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.041071 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.043077 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.042964 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7vhjb" Apr 20 15:02:32.043722 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.043699 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 15:02:32.043902 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.043881 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 15:02:32.043998 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.043933 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-jbg2c\"" Apr 20 15:02:32.044519 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.044187 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 15:02:32.045105 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.044880 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 15:02:32.045219 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.045121 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 15:02:32.045463 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.045447 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-drcvv\"" Apr 20 15:02:32.045570 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.045505 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 15:02:32.047558 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.047222 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.049827 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.049765 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-xrl2v\"" Apr 20 15:02:32.049827 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.049778 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.049827 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.049776 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 15:02:32.050030 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.049874 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.050570 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.050545 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 15:02:32.050855 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.050835 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 15:02:32.050933 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.050874 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 15:02:32.051456 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.051239 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 15:02:32.051456 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.051270 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 15:02:32.051805 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.051784 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-tfhgl\"" Apr 20 15:02:32.051925 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.051902 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:32.052185 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.052007 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 15:02:32.052185 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:32.052025 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dp887" podUID="5987592a-660d-4466-bbc4-5bd812cca838" Apr 20 15:02:32.052185 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.052049 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 15:02:32.052185 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.052105 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 15:02:32.052489 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.052468 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-z2vjb\"" Apr 20 15:02:32.053517 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.053389 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:32.053517 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:32.053464 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z8nvj" podUID="0bee40ff-47b7-46ba-adb1-4493194f0ff8" Apr 20 15:02:32.057144 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.057071 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-f6vw2" Apr 20 15:02:32.058982 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.058963 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-kztg4" Apr 20 15:02:32.059078 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.058990 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.059505 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.059487 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 15:02:32.059637 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.059540 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-fgj9r\"" Apr 20 15:02:32.059869 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.059852 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 15:02:32.059951 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.059901 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 15:02:32.061173 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.061156 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 15:02:32.061298 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.061155 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 15:02:32.061546 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.061530 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 15:02:32.061670 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.061650 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 15:02:32.061926 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.061910 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 15:02:32.062005 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.061944 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 15:02:32.062058 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.062007 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 15:02:32.062117 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.062100 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-cb8qj\"" Apr 20 15:02:32.062165 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.062134 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-cg2g6\"" Apr 20 15:02:32.062211 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.062172 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 15:02:32.068755 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.068729 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-system-cni-dir\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.068870 ip-10-0-129-115 
kubenswrapper[2577]: I0420 15:02:32.068769 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bv86\" (UniqueName: \"kubernetes.io/projected/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-kube-api-access-9bv86\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.068870 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.068799 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-run-ovn\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.068870 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.068817 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9f4afccb-d2c5-4658-8e04-1c1e51f934ee-socket-dir\") pod \"aws-ebs-csi-driver-node-dkth8\" (UID: \"9f4afccb-d2c5-4658-8e04-1c1e51f934ee\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.069025 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.068841 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9f4afccb-d2c5-4658-8e04-1c1e51f934ee-sys-fs\") pod \"aws-ebs-csi-driver-node-dkth8\" (UID: \"9f4afccb-d2c5-4658-8e04-1c1e51f934ee\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.069025 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.068950 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-host-run-netns\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.069025 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.068972 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-hostroot\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.069025 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.068994 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-etc-kubernetes\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.069215 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069023 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xcfk\" (UniqueName: \"kubernetes.io/projected/5987592a-660d-4466-bbc4-5bd812cca838-kube-api-access-8xcfk\") pod \"network-metrics-daemon-dp887\" (UID: \"5987592a-660d-4466-bbc4-5bd812cca838\") " pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:32.069215 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069048 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-node-log\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.069215 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069075 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-ovnkube-config\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.069215 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069099 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-etc-modprobe-d\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.069215 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069187 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-os-release\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.069463 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069217 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-var-lib-kubelet\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.069463 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069246 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-multus-socket-dir-parent\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.069463 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069308 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-host-var-lib-kubelet\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.069463 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069345 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7d301008-567f-46b2-9fbb-f3d7ea2a6456-iptables-alerter-script\") pod \"iptables-alerter-f6vw2\" (UID: \"7d301008-567f-46b2-9fbb-f3d7ea2a6456\") " pod="openshift-network-operator/iptables-alerter-f6vw2" Apr 20 15:02:32.069463 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069374 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-host-kubelet\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.069463 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069415 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-var-lib-openvswitch\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.069463 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069442 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-env-overrides\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.069789 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069469 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.069789 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069497 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-etc-sysctl-conf\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.069789 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069520 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-host\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.069789 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069544 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dfd40e78-07ee-46ff-90d1-6f4a6d4baa55-konnectivity-ca\") pod \"konnectivity-agent-kztg4\" (UID: \"dfd40e78-07ee-46ff-90d1-6f4a6d4baa55\") " pod="kube-system/konnectivity-agent-kztg4" Apr 20 15:02:32.069789 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069570 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.069789 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069596 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9f4afccb-d2c5-4658-8e04-1c1e51f934ee-registration-dir\") pod \"aws-ebs-csi-driver-node-dkth8\" (UID: \"9f4afccb-d2c5-4658-8e04-1c1e51f934ee\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.069789 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069621 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9f4afccb-d2c5-4658-8e04-1c1e51f934ee-etc-selinux\") pod \"aws-ebs-csi-driver-node-dkth8\" (UID: \"9f4afccb-d2c5-4658-8e04-1c1e51f934ee\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.069789 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069647 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3b19ab41-5f62-4594-9262-8789718fb9e9-serviceca\") pod \"node-ca-7vhjb\" (UID: \"3b19ab41-5f62-4594-9262-8789718fb9e9\") " pod="openshift-image-registry/node-ca-7vhjb" Apr 20 15:02:32.069789 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069681 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-etc-sysconfig\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.069789 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069706 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-host-run-k8s-cni-cncf-io\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.069789 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069731 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftq9c\" (UniqueName: \"kubernetes.io/projected/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-kube-api-access-ftq9c\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.069789 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069765 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-cni-binary-copy\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.070395 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069793 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.070395 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069820 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-multus-cni-dir\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.070395 ip-10-0-129-115 kubenswrapper[2577]: 
I0420 15:02:32.069843 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-cnibin\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.070395 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069868 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-host-cni-netd\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.070395 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069937 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-ovn-node-metrics-cert\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.070395 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.069993 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-etc-sysctl-d\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.070395 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070023 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-tmp\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.070395 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070047 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-host-var-lib-cni-multus\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.070395 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070086 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csldl\" (UniqueName: \"kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl\") pod \"network-check-target-z8nvj\" (UID: \"0bee40ff-47b7-46ba-adb1-4493194f0ff8\") " pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:32.070395 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070135 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9smtg\" (UniqueName: \"kubernetes.io/projected/7d301008-567f-46b2-9fbb-f3d7ea2a6456-kube-api-access-9smtg\") pod \"iptables-alerter-f6vw2\" (UID: \"7d301008-567f-46b2-9fbb-f3d7ea2a6456\") " pod="openshift-network-operator/iptables-alerter-f6vw2" Apr 20 15:02:32.070395 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070164 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-host-slash\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.070395 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070191 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-run-systemd\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.070395 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070214 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-system-cni-dir\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.070395 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070238 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs\") pod \"network-metrics-daemon-dp887\" (UID: \"5987592a-660d-4466-bbc4-5bd812cca838\") " pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:32.070395 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070261 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-etc-openvswitch\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.070395 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070304 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.070980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070330 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-etc-systemd\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.070980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070366 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-sys\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.070980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070388 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-multus-conf-dir\") pod \"multus-mm98x\" (UID: 
\"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.070980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070415 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-host-run-multus-certs\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.070980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070455 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjzn9\" (UniqueName: \"kubernetes.io/projected/3b19ab41-5f62-4594-9262-8789718fb9e9-kube-api-access-hjzn9\") pod \"node-ca-7vhjb\" (UID: \"3b19ab41-5f62-4594-9262-8789718fb9e9\") " pod="openshift-image-registry/node-ca-7vhjb" Apr 20 15:02:32.070980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070492 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-os-release\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.070980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070515 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9f4afccb-d2c5-4658-8e04-1c1e51f934ee-device-dir\") pod \"aws-ebs-csi-driver-node-dkth8\" (UID: \"9f4afccb-d2c5-4658-8e04-1c1e51f934ee\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.070980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070541 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzgpp\" (UniqueName: \"kubernetes.io/projected/9f4afccb-d2c5-4658-8e04-1c1e51f934ee-kube-api-access-fzgpp\") pod \"aws-ebs-csi-driver-node-dkth8\" (UID: \"9f4afccb-d2c5-4658-8e04-1c1e51f934ee\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.070980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070564 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-run\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.070980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070619 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-etc-tuned\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.070980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070656 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-cni-binary-copy\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.070980 ip-10-0-129-115 kubenswrapper[2577]: I0420 
15:02:32.070688 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dfd40e78-07ee-46ff-90d1-6f4a6d4baa55-agent-certs\") pod \"konnectivity-agent-kztg4\" (UID: \"dfd40e78-07ee-46ff-90d1-6f4a6d4baa55\") " pod="kube-system/konnectivity-agent-kztg4" Apr 20 15:02:32.070980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070729 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7d301008-567f-46b2-9fbb-f3d7ea2a6456-host-slash\") pod \"iptables-alerter-f6vw2\" (UID: \"7d301008-567f-46b2-9fbb-f3d7ea2a6456\") " pod="openshift-network-operator/iptables-alerter-f6vw2" Apr 20 15:02:32.070980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070754 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-log-socket\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.070980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070788 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-etc-kubernetes\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.070980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070814 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-multus-daemon-config\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.070980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070843 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-systemd-units\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.071713 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070884 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-host-run-netns\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.071713 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070909 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-ovnkube-script-lib\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.071713 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070937 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-cnibin\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.071713 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.070997 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-lib-modules\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.071713 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.071037 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-host-var-lib-cni-bin\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.071713 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.071066 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-run-openvswitch\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.071713 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.071117 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-host-run-ovn-kubernetes\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.071713 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.071144 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-host-cni-bin\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.071713 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.071168 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9f4afccb-d2c5-4658-8e04-1c1e51f934ee-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dkth8\" (UID: \"9f4afccb-d2c5-4658-8e04-1c1e51f934ee\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.071713 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.071191 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b19ab41-5f62-4594-9262-8789718fb9e9-host\") pod \"node-ca-7vhjb\" (UID: \"3b19ab41-5f62-4594-9262-8789718fb9e9\") " pod="openshift-image-registry/node-ca-7vhjb" Apr 20 15:02:32.071713 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.071220 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfkn6\" (UniqueName: \"kubernetes.io/projected/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-kube-api-access-jfkn6\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: 
\"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.071713 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.071245 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4jjm\" (UniqueName: \"kubernetes.io/projected/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-kube-api-access-j4jjm\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.094914 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.094875 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:57:31 +0000 UTC" deadline="2027-11-16 22:55:43.366318856 +0000 UTC" Apr 20 15:02:32.094914 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.094910 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13807h53m11.271414476s" Apr 20 15:02:32.157970 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.157933 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 15:02:32.171596 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171562 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-etc-modprobe-d\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.171596 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-os-release\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.171842 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171630 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-var-lib-kubelet\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.171842 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171657 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-multus-socket-dir-parent\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.171842 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171680 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-host-var-lib-kubelet\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.171842 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171705 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7d301008-567f-46b2-9fbb-f3d7ea2a6456-iptables-alerter-script\") pod \"iptables-alerter-f6vw2\" (UID: 
\"7d301008-567f-46b2-9fbb-f3d7ea2a6456\") " pod="openshift-network-operator/iptables-alerter-f6vw2" Apr 20 15:02:32.171842 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171725 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-var-lib-kubelet\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.171842 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171729 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-host-kubelet\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.171842 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171745 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-etc-modprobe-d\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.171842 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171762 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-host-kubelet\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.171842 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171763 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-var-lib-openvswitch\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.171842 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171747 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-os-release\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.171842 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171793 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-env-overrides\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.171842 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171797 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-var-lib-openvswitch\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.171842 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171804 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-multus-socket-dir-parent\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.171842 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171819 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.171842 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171846 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-etc-sysctl-conf\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.172571 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171867 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-host\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.172571 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171899 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-host-var-lib-kubelet\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.172571 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171906 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dfd40e78-07ee-46ff-90d1-6f4a6d4baa55-konnectivity-ca\") pod \"konnectivity-agent-kztg4\" (UID: \"dfd40e78-07ee-46ff-90d1-6f4a6d4baa55\") " pod="kube-system/konnectivity-agent-kztg4" Apr 20 15:02:32.172571 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171949 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.172571 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.171979 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9f4afccb-d2c5-4658-8e04-1c1e51f934ee-registration-dir\") pod \"aws-ebs-csi-driver-node-dkth8\" (UID: \"9f4afccb-d2c5-4658-8e04-1c1e51f934ee\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.172571 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172028 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9f4afccb-d2c5-4658-8e04-1c1e51f934ee-etc-selinux\") pod \"aws-ebs-csi-driver-node-dkth8\" (UID: \"9f4afccb-d2c5-4658-8e04-1c1e51f934ee\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 
20 15:02:32.172571 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172052 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-etc-sysctl-conf\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.172571 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172099 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.172571 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172151 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9f4afccb-d2c5-4658-8e04-1c1e51f934ee-registration-dir\") pod \"aws-ebs-csi-driver-node-dkth8\" (UID: \"9f4afccb-d2c5-4658-8e04-1c1e51f934ee\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.172571 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172244 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9f4afccb-d2c5-4658-8e04-1c1e51f934ee-etc-selinux\") pod \"aws-ebs-csi-driver-node-dkth8\" (UID: \"9f4afccb-d2c5-4658-8e04-1c1e51f934ee\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.172571 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172337 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-env-overrides\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.172571 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172054 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3b19ab41-5f62-4594-9262-8789718fb9e9-serviceca\") pod \"node-ca-7vhjb\" (UID: \"3b19ab41-5f62-4594-9262-8789718fb9e9\") " pod="openshift-image-registry/node-ca-7vhjb" Apr 20 15:02:32.172571 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172407 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-etc-sysconfig\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.172571 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172408 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-host\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.172571 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172433 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-host-run-k8s-cni-cncf-io\") pod \"multus-mm98x\" 
(UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.172571 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172436 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dfd40e78-07ee-46ff-90d1-6f4a6d4baa55-konnectivity-ca\") pod \"konnectivity-agent-kztg4\" (UID: \"dfd40e78-07ee-46ff-90d1-6f4a6d4baa55\") " pod="kube-system/konnectivity-agent-kztg4" Apr 20 15:02:32.172571 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172440 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7d301008-567f-46b2-9fbb-f3d7ea2a6456-iptables-alerter-script\") pod \"iptables-alerter-f6vw2\" (UID: \"7d301008-567f-46b2-9fbb-f3d7ea2a6456\") " pod="openshift-network-operator/iptables-alerter-f6vw2" Apr 20 15:02:32.173352 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172459 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftq9c\" (UniqueName: \"kubernetes.io/projected/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-kube-api-access-ftq9c\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.173352 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172470 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-etc-sysconfig\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.173352 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172493 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-cni-binary-copy\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.173352 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172506 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-host-run-k8s-cni-cncf-io\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.173352 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172519 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.173352 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172521 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3b19ab41-5f62-4594-9262-8789718fb9e9-serviceca\") pod \"node-ca-7vhjb\" (UID: \"3b19ab41-5f62-4594-9262-8789718fb9e9\") " pod="openshift-image-registry/node-ca-7vhjb" Apr 20 15:02:32.173352 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172633 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.173352 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172635 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-multus-cni-dir\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.173352 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172672 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-cnibin\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.173352 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172711 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-host-cni-netd\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.173352 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172724 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-multus-cni-dir\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.173352 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172738 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-ovn-node-metrics-cert\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.173352 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172748 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-host-cni-netd\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.173352 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172774 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-etc-sysctl-d\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.173352 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172782 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-cnibin\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.173352 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172799 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-tmp\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.173352 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172825 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-host-var-lib-cni-multus\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.173352 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172828 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.174113 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172849 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csldl\" (UniqueName: \"kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl\") pod \"network-check-target-z8nvj\" (UID: \"0bee40ff-47b7-46ba-adb1-4493194f0ff8\") " pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:32.174113 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172882 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-etc-sysctl-d\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.174113 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172888 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-host-var-lib-cni-multus\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.174113 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172918 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9smtg\" (UniqueName: \"kubernetes.io/projected/7d301008-567f-46b2-9fbb-f3d7ea2a6456-kube-api-access-9smtg\") pod \"iptables-alerter-f6vw2\" (UID: \"7d301008-567f-46b2-9fbb-f3d7ea2a6456\") " pod="openshift-network-operator/iptables-alerter-f6vw2" Apr 20 15:02:32.174113 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172942 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-host-slash\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.174113 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172964 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-run-systemd\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.174113 ip-10-0-129-115 kubenswrapper[2577]: 
I0420 15:02:32.172981 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-cni-binary-copy\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.174113 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172995 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-host-slash\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.174113 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.172987 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-system-cni-dir\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.174113 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173026 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-run-systemd\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.174113 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173037 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs\") pod \"network-metrics-daemon-dp887\" (UID: \"5987592a-660d-4466-bbc4-5bd812cca838\") " pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:32.174113 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173066 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-etc-openvswitch\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.174113 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173075 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-system-cni-dir\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.174113 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173091 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 15:02:32.174113 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173093 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.174113 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173125 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-etc-openvswitch\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.174113 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173135 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-etc-systemd\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.174113 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173163 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-sys\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.174935 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:32.173198 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:32.174935 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173206 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-multus-conf-dir\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.174935 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173216 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-etc-systemd\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.174935 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173233 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-host-run-multus-certs\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.174935 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173255 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hjzn9\" (UniqueName: \"kubernetes.io/projected/3b19ab41-5f62-4594-9262-8789718fb9e9-kube-api-access-hjzn9\") pod \"node-ca-7vhjb\" (UID: \"3b19ab41-5f62-4594-9262-8789718fb9e9\") " pod="openshift-image-registry/node-ca-7vhjb" Apr 20 
15:02:32.174935 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173270 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-sys\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.174935 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:32.173302 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs podName:5987592a-660d-4466-bbc4-5bd812cca838 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:32.673254754 +0000 UTC m=+3.108798902 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs") pod "network-metrics-daemon-dp887" (UID: "5987592a-660d-4466-bbc4-5bd812cca838") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:32.174935 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173328 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-multus-conf-dir\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.174935 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173366 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-host-run-multus-certs\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.174935 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173411 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-os-release\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.174935 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173433 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9f4afccb-d2c5-4658-8e04-1c1e51f934ee-device-dir\") pod \"aws-ebs-csi-driver-node-dkth8\" (UID: \"9f4afccb-d2c5-4658-8e04-1c1e51f934ee\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.174935 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173449 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fzgpp\" (UniqueName: \"kubernetes.io/projected/9f4afccb-d2c5-4658-8e04-1c1e51f934ee-kube-api-access-fzgpp\") pod \"aws-ebs-csi-driver-node-dkth8\" (UID: \"9f4afccb-d2c5-4658-8e04-1c1e51f934ee\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.174935 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173466 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-run\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.174935 ip-10-0-129-115 kubenswrapper[2577]: I0420 
15:02:32.173498 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-os-release\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.174935 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173510 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-run\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.174935 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173509 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9f4afccb-d2c5-4658-8e04-1c1e51f934ee-device-dir\") pod \"aws-ebs-csi-driver-node-dkth8\" (UID: \"9f4afccb-d2c5-4658-8e04-1c1e51f934ee\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.174935 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173593 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-etc-tuned\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.175758 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173619 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-cni-binary-copy\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.175758 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173642 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dfd40e78-07ee-46ff-90d1-6f4a6d4baa55-agent-certs\") pod \"konnectivity-agent-kztg4\" (UID: \"dfd40e78-07ee-46ff-90d1-6f4a6d4baa55\") " pod="kube-system/konnectivity-agent-kztg4" Apr 20 15:02:32.175758 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173692 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7d301008-567f-46b2-9fbb-f3d7ea2a6456-host-slash\") pod \"iptables-alerter-f6vw2\" (UID: \"7d301008-567f-46b2-9fbb-f3d7ea2a6456\") " pod="openshift-network-operator/iptables-alerter-f6vw2" Apr 20 15:02:32.175758 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173731 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7d301008-567f-46b2-9fbb-f3d7ea2a6456-host-slash\") pod \"iptables-alerter-f6vw2\" (UID: \"7d301008-567f-46b2-9fbb-f3d7ea2a6456\") " pod="openshift-network-operator/iptables-alerter-f6vw2" Apr 20 15:02:32.175758 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173755 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-log-socket\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.175758 ip-10-0-129-115 
kubenswrapper[2577]: I0420 15:02:32.173787 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-etc-kubernetes\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.175758 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173808 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-multus-daemon-config\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.175758 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173829 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-systemd-units\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.175758 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173849 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-host-run-netns\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.175758 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173868 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-ovnkube-script-lib\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.175758 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173881 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-etc-kubernetes\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.175758 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173888 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-cnibin\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.175758 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173913 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-cnibin\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.175758 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173942 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-lib-modules\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " 
pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.175758 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173968 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-host-var-lib-cni-bin\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.175758 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.173993 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-run-openvswitch\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.175758 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174018 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-host-run-ovn-kubernetes\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.175758 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174045 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-host-cni-bin\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.176651 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174069 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9f4afccb-d2c5-4658-8e04-1c1e51f934ee-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dkth8\" (UID: \"9f4afccb-d2c5-4658-8e04-1c1e51f934ee\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.176651 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174113 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.176651 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174133 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b19ab41-5f62-4594-9262-8789718fb9e9-host\") pod \"node-ca-7vhjb\" (UID: \"3b19ab41-5f62-4594-9262-8789718fb9e9\") " pod="openshift-image-registry/node-ca-7vhjb" Apr 20 15:02:32.176651 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174172 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-run-openvswitch\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.176651 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174206 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-host-cni-bin\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.176651 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174225 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-host-run-ovn-kubernetes\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.176651 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174242 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-lib-modules\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.176651 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174252 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9f4afccb-d2c5-4658-8e04-1c1e51f934ee-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dkth8\" (UID: \"9f4afccb-d2c5-4658-8e04-1c1e51f934ee\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.176651 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174264 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-host-var-lib-cni-bin\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.176651 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174176 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b19ab41-5f62-4594-9262-8789718fb9e9-host\") pod \"node-ca-7vhjb\" (UID: \"3b19ab41-5f62-4594-9262-8789718fb9e9\") " pod="openshift-image-registry/node-ca-7vhjb" Apr 20 15:02:32.176651 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174277 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfkn6\" (UniqueName: \"kubernetes.io/projected/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-kube-api-access-jfkn6\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.176651 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174318 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-systemd-units\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.176651 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174329 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4jjm\" (UniqueName: \"kubernetes.io/projected/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-kube-api-access-j4jjm\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.176651 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174352 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-host-run-netns\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.176651 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174357 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-system-cni-dir\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.176651 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174416 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-system-cni-dir\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.176651 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174330 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-log-socket\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.176651 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174560 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bv86\" (UniqueName: \"kubernetes.io/projected/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-kube-api-access-9bv86\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.177479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174590 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-run-ovn\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.177479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174615 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9f4afccb-d2c5-4658-8e04-1c1e51f934ee-socket-dir\") pod \"aws-ebs-csi-driver-node-dkth8\" (UID: \"9f4afccb-d2c5-4658-8e04-1c1e51f934ee\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.177479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174639 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9f4afccb-d2c5-4658-8e04-1c1e51f934ee-sys-fs\") pod \"aws-ebs-csi-driver-node-dkth8\" (UID: \"9f4afccb-d2c5-4658-8e04-1c1e51f934ee\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.177479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174687 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-run-ovn\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.177479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174693 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9f4afccb-d2c5-4658-8e04-1c1e51f934ee-sys-fs\") pod \"aws-ebs-csi-driver-node-dkth8\" (UID: \"9f4afccb-d2c5-4658-8e04-1c1e51f934ee\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.177479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174721 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-host-run-netns\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.177479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174752 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-hostroot\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.177479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174779 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-etc-kubernetes\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.177479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174790 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9f4afccb-d2c5-4658-8e04-1c1e51f934ee-socket-dir\") pod \"aws-ebs-csi-driver-node-dkth8\" (UID: \"9f4afccb-d2c5-4658-8e04-1c1e51f934ee\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.177479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174807 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xcfk\" (UniqueName: \"kubernetes.io/projected/5987592a-660d-4466-bbc4-5bd812cca838-kube-api-access-8xcfk\") pod \"network-metrics-daemon-dp887\" (UID: \"5987592a-660d-4466-bbc4-5bd812cca838\") " pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:32.177479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174833 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-host-run-netns\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.177479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174837 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-cni-binary-copy\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.177479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174861 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-ovnkube-script-lib\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.177479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174883 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-hostroot\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.177479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174931 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-etc-kubernetes\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.177479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174953 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-node-log\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.177479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174985 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-ovnkube-config\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.177479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.174990 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-node-log\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.178031 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.175273 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-multus-daemon-config\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.178031 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.175624 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-ovnkube-config\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.178031 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.177003 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-tmp\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.178031 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.177164 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-ovn-node-metrics-cert\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.178031 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.177504 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/dfd40e78-07ee-46ff-90d1-6f4a6d4baa55-agent-certs\") pod \"konnectivity-agent-kztg4\" (UID: \"dfd40e78-07ee-46ff-90d1-6f4a6d4baa55\") " pod="kube-system/konnectivity-agent-kztg4" Apr 20 15:02:32.178031 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.177636 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-etc-tuned\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.183303 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.183261 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftq9c\" (UniqueName: \"kubernetes.io/projected/65dfd729-42e3-45f2-8f76-eec9fa62c8c4-kube-api-access-ftq9c\") pod \"ovnkube-node-x9pxn\" (UID: \"65dfd729-42e3-45f2-8f76-eec9fa62c8c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.183472 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:32.183452 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 15:02:32.183540 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:32.183478 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 15:02:32.183540 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:32.183493 2577 projected.go:194] Error preparing data for projected volume kube-api-access-csldl for pod openshift-network-diagnostics/network-check-target-z8nvj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:32.183728 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:32.183566 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl podName:0bee40ff-47b7-46ba-adb1-4493194f0ff8 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:32.683547261 +0000 UTC m=+3.119091401 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-csldl" (UniqueName: "kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl") pod "network-check-target-z8nvj" (UID: "0bee40ff-47b7-46ba-adb1-4493194f0ff8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:32.184801 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.184776 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4jjm\" (UniqueName: \"kubernetes.io/projected/16ae23d8-9ea1-44ed-8bab-f54febfa4bc6-kube-api-access-j4jjm\") pod \"tuned-9jslm\" (UID: \"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.185417 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.185373 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xcfk\" (UniqueName: \"kubernetes.io/projected/5987592a-660d-4466-bbc4-5bd812cca838-kube-api-access-8xcfk\") pod \"network-metrics-daemon-dp887\" (UID: \"5987592a-660d-4466-bbc4-5bd812cca838\") " pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:32.185716 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.185641 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjzn9\" (UniqueName: \"kubernetes.io/projected/3b19ab41-5f62-4594-9262-8789718fb9e9-kube-api-access-hjzn9\") pod \"node-ca-7vhjb\" (UID: \"3b19ab41-5f62-4594-9262-8789718fb9e9\") " pod="openshift-image-registry/node-ca-7vhjb" Apr 20 15:02:32.185896 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.185864 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9smtg\" (UniqueName: \"kubernetes.io/projected/7d301008-567f-46b2-9fbb-f3d7ea2a6456-kube-api-access-9smtg\") pod \"iptables-alerter-f6vw2\" (UID: \"7d301008-567f-46b2-9fbb-f3d7ea2a6456\") " pod="openshift-network-operator/iptables-alerter-f6vw2" Apr 20 15:02:32.186178 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.186155 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bv86\" (UniqueName: \"kubernetes.io/projected/f68a7ade-f858-4f5d-b2ac-1ea4270c1737-kube-api-access-9bv86\") pod \"multus-mm98x\" (UID: \"f68a7ade-f858-4f5d-b2ac-1ea4270c1737\") " pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.186395 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.186339 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfkn6\" (UniqueName: \"kubernetes.io/projected/a42bcb4e-3ab0-49cb-8302-c0f2152edc3d-kube-api-access-jfkn6\") pod \"multus-additional-cni-plugins-cx4lv\" (UID: \"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d\") " pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.188033 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.188012 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzgpp\" (UniqueName: \"kubernetes.io/projected/9f4afccb-d2c5-4658-8e04-1c1e51f934ee-kube-api-access-fzgpp\") pod \"aws-ebs-csi-driver-node-dkth8\" (UID: \"9f4afccb-d2c5-4658-8e04-1c1e51f934ee\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.354386 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.354270 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" Apr 20 15:02:32.362351 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.362322 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7vhjb" Apr 20 15:02:32.371065 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.371031 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cx4lv" Apr 20 15:02:32.377885 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.377863 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9jslm" Apr 20 15:02:32.385500 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.385472 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mm98x" Apr 20 15:02:32.394376 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.394342 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-f6vw2" Apr 20 15:02:32.402136 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.402106 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-kztg4" Apr 20 15:02:32.407880 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.407854 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:32.678987 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.678895 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs\") pod \"network-metrics-daemon-dp887\" (UID: \"5987592a-660d-4466-bbc4-5bd812cca838\") " pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:32.679147 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:32.679019 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:32.679147 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:32.679089 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs podName:5987592a-660d-4466-bbc4-5bd812cca838 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:33.679069997 +0000 UTC m=+4.114614133 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs") pod "network-metrics-daemon-dp887" (UID: "5987592a-660d-4466-bbc4-5bd812cca838") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:32.779204 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:32.779167 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csldl\" (UniqueName: \"kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl\") pod \"network-check-target-z8nvj\" (UID: \"0bee40ff-47b7-46ba-adb1-4493194f0ff8\") " pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:32.779408 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:32.779382 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 15:02:32.779408 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:32.779410 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 15:02:32.779559 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:32.779425 2577 projected.go:194] Error preparing data for projected volume kube-api-access-csldl for pod openshift-network-diagnostics/network-check-target-z8nvj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:32.779559 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:32.779488 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl podName:0bee40ff-47b7-46ba-adb1-4493194f0ff8 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:33.779467764 +0000 UTC m=+4.215011914 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-csldl" (UniqueName: "kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl") pod "network-check-target-z8nvj" (UID: "0bee40ff-47b7-46ba-adb1-4493194f0ff8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:32.779809 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:32.779701 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf68a7ade_f858_4f5d_b2ac_1ea4270c1737.slice/crio-cf00b653caeb1bc40ac47c668e442e92b160a54d3ded8979e5855436eaea2d48 WatchSource:0}: Error finding container cf00b653caeb1bc40ac47c668e442e92b160a54d3ded8979e5855436eaea2d48: Status 404 returned error can't find the container with id cf00b653caeb1bc40ac47c668e442e92b160a54d3ded8979e5855436eaea2d48 Apr 20 15:02:32.780466 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:32.780441 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfd40e78_07ee_46ff_90d1_6f4a6d4baa55.slice/crio-c2d1d8b0aa1b11b8446e4d73862e56245463a6a75898cebe7f00457b77759bd3 WatchSource:0}: Error finding container c2d1d8b0aa1b11b8446e4d73862e56245463a6a75898cebe7f00457b77759bd3: Status 404 returned error can't find the container with id c2d1d8b0aa1b11b8446e4d73862e56245463a6a75898cebe7f00457b77759bd3 Apr 20 15:02:32.783416 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:32.783392 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda42bcb4e_3ab0_49cb_8302_c0f2152edc3d.slice/crio-909070c920f244e97d67b58b7f6a0c0bcdb3952140b195e96237fae2998dc9a4 WatchSource:0}: Error finding container 909070c920f244e97d67b58b7f6a0c0bcdb3952140b195e96237fae2998dc9a4: Status 404 returned error can't find the container with id 909070c920f244e97d67b58b7f6a0c0bcdb3952140b195e96237fae2998dc9a4 Apr 20 15:02:32.784923 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:32.784900 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65dfd729_42e3_45f2_8f76_eec9fa62c8c4.slice/crio-044069b54ded36fc837effd239d3937160110f8a9210bfbb25765417ff2c9278 WatchSource:0}: Error finding container 044069b54ded36fc837effd239d3937160110f8a9210bfbb25765417ff2c9278: Status 404 returned error can't find the container with id 044069b54ded36fc837effd239d3937160110f8a9210bfbb25765417ff2c9278 Apr 20 15:02:32.785511 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:32.785491 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f4afccb_d2c5_4658_8e04_1c1e51f934ee.slice/crio-9c2e25b59bdee481c9beb8883f407252361cbc763db8b9f7a6a1647e196e9afd WatchSource:0}: Error finding container 9c2e25b59bdee481c9beb8883f407252361cbc763db8b9f7a6a1647e196e9afd: Status 404 returned error can't find the container with id 9c2e25b59bdee481c9beb8883f407252361cbc763db8b9f7a6a1647e196e9afd Apr 20 15:02:32.786621 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:32.786590 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d301008_567f_46b2_9fbb_f3d7ea2a6456.slice/crio-51a83d3216a645128de8468adb5cfbee385bda13f1761a3c215a0f9576bd02f9 WatchSource:0}: Error finding 
container 51a83d3216a645128de8468adb5cfbee385bda13f1761a3c215a0f9576bd02f9: Status 404 returned error can't find the container with id 51a83d3216a645128de8468adb5cfbee385bda13f1761a3c215a0f9576bd02f9 Apr 20 15:02:32.787564 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:32.787450 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b19ab41_5f62_4594_9262_8789718fb9e9.slice/crio-374d5cfe23011783b337324d35dbc648768d84e7ddd925e4bd931b3b7da0744c WatchSource:0}: Error finding container 374d5cfe23011783b337324d35dbc648768d84e7ddd925e4bd931b3b7da0744c: Status 404 returned error can't find the container with id 374d5cfe23011783b337324d35dbc648768d84e7ddd925e4bd931b3b7da0744c Apr 20 15:02:32.788788 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:32.788759 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16ae23d8_9ea1_44ed_8bab_f54febfa4bc6.slice/crio-cdc226ab43e9e8d21c714825e81a5ea7f4969de524884dd2870e98eb75ba502a WatchSource:0}: Error finding container cdc226ab43e9e8d21c714825e81a5ea7f4969de524884dd2870e98eb75ba502a: Status 404 returned error can't find the container with id cdc226ab43e9e8d21c714825e81a5ea7f4969de524884dd2870e98eb75ba502a Apr 20 15:02:33.095983 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:33.095941 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:57:31 +0000 UTC" deadline="2027-12-05 04:39:05.212920406 +0000 UTC" Apr 20 15:02:33.095983 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:33.095975 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14245h36m32.116947803s" Apr 20 15:02:33.191886 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:33.191238 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:33.191886 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:33.191383 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z8nvj" podUID="0bee40ff-47b7-46ba-adb1-4493194f0ff8" Apr 20 15:02:33.203645 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:33.203583 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-115.ec2.internal" event={"ID":"6ec3cd5b714ec0026f823b211f468c89","Type":"ContainerStarted","Data":"f0699ca2641bc74056fc9bf874c408cc73fc7a410621303da5ac30f4ae1f69df"} Apr 20 15:02:33.205616 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:33.205558 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7vhjb" event={"ID":"3b19ab41-5f62-4594-9262-8789718fb9e9","Type":"ContainerStarted","Data":"374d5cfe23011783b337324d35dbc648768d84e7ddd925e4bd931b3b7da0744c"} Apr 20 15:02:33.209190 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:33.209137 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f6vw2" event={"ID":"7d301008-567f-46b2-9fbb-f3d7ea2a6456","Type":"ContainerStarted","Data":"51a83d3216a645128de8468adb5cfbee385bda13f1761a3c215a0f9576bd02f9"} Apr 20 15:02:33.214756 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:33.214701 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" event={"ID":"9f4afccb-d2c5-4658-8e04-1c1e51f934ee","Type":"ContainerStarted","Data":"9c2e25b59bdee481c9beb8883f407252361cbc763db8b9f7a6a1647e196e9afd"} Apr 20 15:02:33.215670 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:33.214995 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-115.ec2.internal" podStartSLOduration=2.214978727 podStartE2EDuration="2.214978727s" podCreationTimestamp="2026-04-20 15:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:02:33.214848167 +0000 UTC m=+3.650392322" watchObservedRunningTime="2026-04-20 15:02:33.214978727 +0000 UTC m=+3.650522864" Apr 20 15:02:33.219375 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:33.219335 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mm98x" event={"ID":"f68a7ade-f858-4f5d-b2ac-1ea4270c1737","Type":"ContainerStarted","Data":"cf00b653caeb1bc40ac47c668e442e92b160a54d3ded8979e5855436eaea2d48"} Apr 20 15:02:33.220784 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:33.220755 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" event={"ID":"65dfd729-42e3-45f2-8f76-eec9fa62c8c4","Type":"ContainerStarted","Data":"044069b54ded36fc837effd239d3937160110f8a9210bfbb25765417ff2c9278"} Apr 20 15:02:33.223891 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:33.223864 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9jslm" event={"ID":"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6","Type":"ContainerStarted","Data":"cdc226ab43e9e8d21c714825e81a5ea7f4969de524884dd2870e98eb75ba502a"} Apr 20 15:02:33.226946 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:33.226731 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cx4lv" event={"ID":"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d","Type":"ContainerStarted","Data":"909070c920f244e97d67b58b7f6a0c0bcdb3952140b195e96237fae2998dc9a4"} Apr 20 15:02:33.229617 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:33.229590 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kztg4" event={"ID":"dfd40e78-07ee-46ff-90d1-6f4a6d4baa55","Type":"ContainerStarted","Data":"c2d1d8b0aa1b11b8446e4d73862e56245463a6a75898cebe7f00457b77759bd3"} Apr 20 15:02:33.686753 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:33.686713 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs\") pod \"network-metrics-daemon-dp887\" (UID: \"5987592a-660d-4466-bbc4-5bd812cca838\") " pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:33.686950 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:33.686894 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:33.687011 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:33.686961 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs podName:5987592a-660d-4466-bbc4-5bd812cca838 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:35.686941833 +0000 UTC m=+6.122485978 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs") pod "network-metrics-daemon-dp887" (UID: "5987592a-660d-4466-bbc4-5bd812cca838") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:33.787380 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:33.787333 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csldl\" (UniqueName: \"kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl\") pod \"network-check-target-z8nvj\" (UID: \"0bee40ff-47b7-46ba-adb1-4493194f0ff8\") " pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:33.787623 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:33.787516 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 15:02:33.787623 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:33.787541 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 15:02:33.787623 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:33.787555 2577 projected.go:194] Error preparing data for projected volume kube-api-access-csldl for pod openshift-network-diagnostics/network-check-target-z8nvj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:33.787623 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:33.787607 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl podName:0bee40ff-47b7-46ba-adb1-4493194f0ff8 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:35.787587654 +0000 UTC m=+6.223131800 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-csldl" (UniqueName: "kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl") pod "network-check-target-z8nvj" (UID: "0bee40ff-47b7-46ba-adb1-4493194f0ff8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:34.196792 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:34.196229 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:34.196792 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:34.196393 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dp887" podUID="5987592a-660d-4466-bbc4-5bd812cca838" Apr 20 15:02:34.242474 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:34.241346 2577 generic.go:358] "Generic (PLEG): container finished" podID="bc6bbe5a7166ac8cdce602f3f8f7acd2" containerID="10168e827df19c14629df252ebc4fe28b46faa1517d0bd5d86e8f4e29364a709" exitCode=0 Apr 20 15:02:34.242474 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:34.242346 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-115.ec2.internal" event={"ID":"bc6bbe5a7166ac8cdce602f3f8f7acd2","Type":"ContainerDied","Data":"10168e827df19c14629df252ebc4fe28b46faa1517d0bd5d86e8f4e29364a709"} Apr 20 15:02:35.191841 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:35.191200 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:35.191841 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:35.191385 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z8nvj" podUID="0bee40ff-47b7-46ba-adb1-4493194f0ff8" Apr 20 15:02:35.256992 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:35.256946 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-115.ec2.internal" event={"ID":"bc6bbe5a7166ac8cdce602f3f8f7acd2","Type":"ContainerStarted","Data":"75c2568b02409c68e6ce1dee1fe075ec0c06e3680fe57eb2a4b55085cf28ed80"} Apr 20 15:02:35.270843 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:35.270560 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-115.ec2.internal" podStartSLOduration=4.270537497 podStartE2EDuration="4.270537497s" podCreationTimestamp="2026-04-20 15:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:02:35.26978736 +0000 UTC m=+5.705331508" watchObservedRunningTime="2026-04-20 15:02:35.270537497 +0000 UTC m=+5.706081652" Apr 20 15:02:35.704957 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:35.704919 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs\") pod \"network-metrics-daemon-dp887\" (UID: \"5987592a-660d-4466-bbc4-5bd812cca838\") " pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:35.705133 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:35.705109 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:35.705205 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:35.705180 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs podName:5987592a-660d-4466-bbc4-5bd812cca838 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:39.705161996 +0000 UTC m=+10.140706133 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs") pod "network-metrics-daemon-dp887" (UID: "5987592a-660d-4466-bbc4-5bd812cca838") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:35.806081 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:35.805993 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csldl\" (UniqueName: \"kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl\") pod \"network-check-target-z8nvj\" (UID: \"0bee40ff-47b7-46ba-adb1-4493194f0ff8\") " pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:35.806255 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:35.806210 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 15:02:35.806255 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:35.806239 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 15:02:35.806255 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:35.806252 2577 projected.go:194] Error preparing data for projected volume kube-api-access-csldl for pod openshift-network-diagnostics/network-check-target-z8nvj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:35.806451 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:35.806335 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl podName:0bee40ff-47b7-46ba-adb1-4493194f0ff8 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:39.806314268 +0000 UTC m=+10.241858401 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-csldl" (UniqueName: "kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl") pod "network-check-target-z8nvj" (UID: "0bee40ff-47b7-46ba-adb1-4493194f0ff8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:36.191817 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:36.191248 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:36.191817 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:36.191444 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dp887" podUID="5987592a-660d-4466-bbc4-5bd812cca838" Apr 20 15:02:37.191227 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:37.191186 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:37.191690 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:37.191444 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z8nvj" podUID="0bee40ff-47b7-46ba-adb1-4493194f0ff8" Apr 20 15:02:38.190911 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:38.190873 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:38.191109 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:38.191029 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dp887" podUID="5987592a-660d-4466-bbc4-5bd812cca838" Apr 20 15:02:39.191104 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:39.191063 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:39.191573 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:39.191295 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z8nvj" podUID="0bee40ff-47b7-46ba-adb1-4493194f0ff8" Apr 20 15:02:39.738553 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:39.738487 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs\") pod \"network-metrics-daemon-dp887\" (UID: \"5987592a-660d-4466-bbc4-5bd812cca838\") " pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:39.738750 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:39.738644 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:39.738750 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:39.738714 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs podName:5987592a-660d-4466-bbc4-5bd812cca838 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:47.738693939 +0000 UTC m=+18.174238072 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs") pod "network-metrics-daemon-dp887" (UID: "5987592a-660d-4466-bbc4-5bd812cca838") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:39.839013 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:39.838963 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csldl\" (UniqueName: \"kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl\") pod \"network-check-target-z8nvj\" (UID: \"0bee40ff-47b7-46ba-adb1-4493194f0ff8\") " pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:39.839176 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:39.839167 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 15:02:39.839221 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:39.839186 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 15:02:39.839221 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:39.839199 2577 projected.go:194] Error preparing data for projected volume kube-api-access-csldl for pod openshift-network-diagnostics/network-check-target-z8nvj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:39.839338 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:39.839260 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl podName:0bee40ff-47b7-46ba-adb1-4493194f0ff8 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:47.839240618 +0000 UTC m=+18.274784757 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-csldl" (UniqueName: "kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl") pod "network-check-target-z8nvj" (UID: "0bee40ff-47b7-46ba-adb1-4493194f0ff8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:40.195368 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:40.195037 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:40.195368 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:40.195168 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dp887" podUID="5987592a-660d-4466-bbc4-5bd812cca838" Apr 20 15:02:40.793837 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:40.793800 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hc5nc"] Apr 20 15:02:40.796086 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:40.796056 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hc5nc" Apr 20 15:02:40.798368 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:40.798347 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wt7fq\"" Apr 20 15:02:40.798519 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:40.798397 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 15:02:40.798599 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:40.798585 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 15:02:40.848333 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:40.848261 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/423472ab-8267-4409-9aa0-f1d4a9c14e79-tmp-dir\") pod \"node-resolver-hc5nc\" (UID: \"423472ab-8267-4409-9aa0-f1d4a9c14e79\") " pod="openshift-dns/node-resolver-hc5nc" Apr 20 15:02:40.848530 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:40.848342 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rdc5\" (UniqueName: \"kubernetes.io/projected/423472ab-8267-4409-9aa0-f1d4a9c14e79-kube-api-access-8rdc5\") pod \"node-resolver-hc5nc\" (UID: \"423472ab-8267-4409-9aa0-f1d4a9c14e79\") " pod="openshift-dns/node-resolver-hc5nc" Apr 20 15:02:40.848530 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:40.848378 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/423472ab-8267-4409-9aa0-f1d4a9c14e79-hosts-file\") pod \"node-resolver-hc5nc\" (UID: \"423472ab-8267-4409-9aa0-f1d4a9c14e79\") " pod="openshift-dns/node-resolver-hc5nc" Apr 20 15:02:40.949803 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:40.949763 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/423472ab-8267-4409-9aa0-f1d4a9c14e79-tmp-dir\") pod \"node-resolver-hc5nc\" (UID: \"423472ab-8267-4409-9aa0-f1d4a9c14e79\") " pod="openshift-dns/node-resolver-hc5nc" Apr 20 15:02:40.949979 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:40.949815 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rdc5\" (UniqueName: \"kubernetes.io/projected/423472ab-8267-4409-9aa0-f1d4a9c14e79-kube-api-access-8rdc5\") pod \"node-resolver-hc5nc\" (UID: \"423472ab-8267-4409-9aa0-f1d4a9c14e79\") " pod="openshift-dns/node-resolver-hc5nc" Apr 20 15:02:40.949979 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:40.949848 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/423472ab-8267-4409-9aa0-f1d4a9c14e79-hosts-file\") pod \"node-resolver-hc5nc\" (UID: \"423472ab-8267-4409-9aa0-f1d4a9c14e79\") " pod="openshift-dns/node-resolver-hc5nc" Apr 20 15:02:40.949979 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:40.949942 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/423472ab-8267-4409-9aa0-f1d4a9c14e79-hosts-file\") pod \"node-resolver-hc5nc\" (UID: \"423472ab-8267-4409-9aa0-f1d4a9c14e79\") " pod="openshift-dns/node-resolver-hc5nc" Apr 20 15:02:40.950313 ip-10-0-129-115 kubenswrapper[2577]: I0420 
15:02:40.950270 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/423472ab-8267-4409-9aa0-f1d4a9c14e79-tmp-dir\") pod \"node-resolver-hc5nc\" (UID: \"423472ab-8267-4409-9aa0-f1d4a9c14e79\") " pod="openshift-dns/node-resolver-hc5nc" Apr 20 15:02:40.960505 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:40.960466 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rdc5\" (UniqueName: \"kubernetes.io/projected/423472ab-8267-4409-9aa0-f1d4a9c14e79-kube-api-access-8rdc5\") pod \"node-resolver-hc5nc\" (UID: \"423472ab-8267-4409-9aa0-f1d4a9c14e79\") " pod="openshift-dns/node-resolver-hc5nc" Apr 20 15:02:41.108251 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:41.108150 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hc5nc" Apr 20 15:02:41.190824 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:41.190780 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:41.191010 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:41.190921 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z8nvj" podUID="0bee40ff-47b7-46ba-adb1-4493194f0ff8" Apr 20 15:02:42.191038 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:42.190995 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:42.191513 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:42.191146 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dp887" podUID="5987592a-660d-4466-bbc4-5bd812cca838" Apr 20 15:02:43.191388 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:43.191342 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:43.191815 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:43.191475 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z8nvj" podUID="0bee40ff-47b7-46ba-adb1-4493194f0ff8" Apr 20 15:02:44.193619 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:44.193588 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:44.194074 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:44.193712 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dp887" podUID="5987592a-660d-4466-bbc4-5bd812cca838" Apr 20 15:02:45.191019 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:45.190979 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:45.191188 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:45.191113 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z8nvj" podUID="0bee40ff-47b7-46ba-adb1-4493194f0ff8" Apr 20 15:02:46.191758 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:46.191717 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:46.192235 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:46.191854 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dp887" podUID="5987592a-660d-4466-bbc4-5bd812cca838" Apr 20 15:02:47.190778 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:47.190689 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:47.190959 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:47.190822 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z8nvj" podUID="0bee40ff-47b7-46ba-adb1-4493194f0ff8" Apr 20 15:02:47.799515 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:47.799471 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs\") pod \"network-metrics-daemon-dp887\" (UID: \"5987592a-660d-4466-bbc4-5bd812cca838\") " pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:47.799949 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:47.799597 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:47.799949 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:47.799668 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs podName:5987592a-660d-4466-bbc4-5bd812cca838 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:03.799650417 +0000 UTC m=+34.235194566 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs") pod "network-metrics-daemon-dp887" (UID: "5987592a-660d-4466-bbc4-5bd812cca838") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:47.900296 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:47.900248 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csldl\" (UniqueName: \"kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl\") pod \"network-check-target-z8nvj\" (UID: \"0bee40ff-47b7-46ba-adb1-4493194f0ff8\") " pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:47.900487 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:47.900420 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 15:02:47.900487 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:47.900447 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 15:02:47.900487 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:47.900458 2577 projected.go:194] Error preparing data for projected volume kube-api-access-csldl for pod openshift-network-diagnostics/network-check-target-z8nvj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:47.900620 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:47.900513 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl podName:0bee40ff-47b7-46ba-adb1-4493194f0ff8 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:03.900497318 +0000 UTC m=+34.336041450 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-csldl" (UniqueName: "kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl") pod "network-check-target-z8nvj" (UID: "0bee40ff-47b7-46ba-adb1-4493194f0ff8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:48.191459 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:48.191363 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:48.191637 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:48.191503 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dp887" podUID="5987592a-660d-4466-bbc4-5bd812cca838" Apr 20 15:02:49.190762 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:49.190723 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:49.191224 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:49.190825 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z8nvj" podUID="0bee40ff-47b7-46ba-adb1-4493194f0ff8" Apr 20 15:02:49.661655 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:02:49.661620 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod423472ab_8267_4409_9aa0_f1d4a9c14e79.slice/crio-4cf33296b41c01231529fa834538179c2044ef9a5434c3116bbf426935f68247 WatchSource:0}: Error finding container 4cf33296b41c01231529fa834538179c2044ef9a5434c3116bbf426935f68247: Status 404 returned error can't find the container with id 4cf33296b41c01231529fa834538179c2044ef9a5434c3116bbf426935f68247 Apr 20 15:02:50.191895 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:50.191714 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:50.192392 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:50.191965 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dp887" podUID="5987592a-660d-4466-bbc4-5bd812cca838" Apr 20 15:02:50.285195 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:50.284990 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" event={"ID":"9f4afccb-d2c5-4658-8e04-1c1e51f934ee","Type":"ContainerStarted","Data":"48e2358ce95929327415eb79d9d82b8a427e6c4c5f83dc75056a6a3db4a3b301"} Apr 20 15:02:50.286670 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:50.286630 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mm98x" event={"ID":"f68a7ade-f858-4f5d-b2ac-1ea4270c1737","Type":"ContainerStarted","Data":"7ed94974ed1865427b78edc3d636c621cb8c651bc17112738d1d333e4f38580f"} Apr 20 15:02:50.288731 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:50.288710 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-acl-logging/0.log" Apr 20 15:02:50.289095 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:50.289072 2577 generic.go:358] "Generic (PLEG): container finished" podID="65dfd729-42e3-45f2-8f76-eec9fa62c8c4" containerID="52410a9e3cd7b01138a2d38a8de2e53299fab0e1c9f00854c38e436027818de0" exitCode=1 Apr 20 15:02:50.289189 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:50.289141 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" event={"ID":"65dfd729-42e3-45f2-8f76-eec9fa62c8c4","Type":"ContainerStarted","Data":"b7214c7c648a1ffec2692ddfced6d52606cf0dd144f5653d0e4206afd6f8b5b1"} Apr 20 15:02:50.289189 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:50.289178 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" event={"ID":"65dfd729-42e3-45f2-8f76-eec9fa62c8c4","Type":"ContainerStarted","Data":"29eb2fdc06a08adf7d59d5062b4a880df9f07d78b6c55e129c3d162a8be90db1"} Apr 20 15:02:50.289319 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:50.289201 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" event={"ID":"65dfd729-42e3-45f2-8f76-eec9fa62c8c4","Type":"ContainerDied","Data":"52410a9e3cd7b01138a2d38a8de2e53299fab0e1c9f00854c38e436027818de0"} Apr 20 15:02:50.289319 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:50.289219 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" event={"ID":"65dfd729-42e3-45f2-8f76-eec9fa62c8c4","Type":"ContainerStarted","Data":"6f574d4f0632e600f4745fb7f83a92511d4c4ce3e2cfe303a87f3f2d7be17eed"} Apr 20 15:02:50.292669 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:50.292646 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9jslm" event={"ID":"16ae23d8-9ea1-44ed-8bab-f54febfa4bc6","Type":"ContainerStarted","Data":"617b34b780bc15c6bd67a9caea584b4897e21867c84afa0ea9269d402f71e0d3"} Apr 20 15:02:50.294298 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:50.294261 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cx4lv" event={"ID":"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d","Type":"ContainerStarted","Data":"9daf3b27734e80ba19f4bc5d9ebbfe2ff0e21d39cf7afec86b5b1bb456526bae"} Apr 20 15:02:50.295586 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:50.295560 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kztg4" 
event={"ID":"dfd40e78-07ee-46ff-90d1-6f4a6d4baa55","Type":"ContainerStarted","Data":"13176a92b6d2b0ed454ef1b4f877407942e8af7f474330817de329bf939e498b"} Apr 20 15:02:50.296860 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:50.296834 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hc5nc" event={"ID":"423472ab-8267-4409-9aa0-f1d4a9c14e79","Type":"ContainerStarted","Data":"610f094166e439954153b58cbd165e15a033b2454340b79a7a4f915bf4a3a87e"} Apr 20 15:02:50.296955 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:50.296869 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hc5nc" event={"ID":"423472ab-8267-4409-9aa0-f1d4a9c14e79","Type":"ContainerStarted","Data":"4cf33296b41c01231529fa834538179c2044ef9a5434c3116bbf426935f68247"} Apr 20 15:02:50.298188 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:50.298167 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7vhjb" event={"ID":"3b19ab41-5f62-4594-9262-8789718fb9e9","Type":"ContainerStarted","Data":"ee02ac79cce1ec92c812b0a916fb2424386f9a8f09675df4f68388086b32bf22"} Apr 20 15:02:50.303105 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:50.303066 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mm98x" podStartSLOduration=3.398452368 podStartE2EDuration="20.303056455s" podCreationTimestamp="2026-04-20 15:02:30 +0000 UTC" firstStartedPulling="2026-04-20 15:02:32.781386719 +0000 UTC m=+3.216930852" lastFinishedPulling="2026-04-20 15:02:49.685990804 +0000 UTC m=+20.121534939" observedRunningTime="2026-04-20 15:02:50.302882127 +0000 UTC m=+20.738426280" watchObservedRunningTime="2026-04-20 15:02:50.303056455 +0000 UTC m=+20.738600608" Apr 20 15:02:50.321071 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:50.321016 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7vhjb" podStartSLOduration=3.752347682 podStartE2EDuration="20.321000774s" podCreationTimestamp="2026-04-20 15:02:30 +0000 UTC" firstStartedPulling="2026-04-20 15:02:32.790131371 +0000 UTC m=+3.225675505" lastFinishedPulling="2026-04-20 15:02:49.358784462 +0000 UTC m=+19.794328597" observedRunningTime="2026-04-20 15:02:50.319825676 +0000 UTC m=+20.755369831" watchObservedRunningTime="2026-04-20 15:02:50.321000774 +0000 UTC m=+20.756544928" Apr 20 15:02:50.376733 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:50.376686 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9jslm" podStartSLOduration=3.509833004 podStartE2EDuration="20.37667123s" podCreationTimestamp="2026-04-20 15:02:30 +0000 UTC" firstStartedPulling="2026-04-20 15:02:32.791098238 +0000 UTC m=+3.226642374" lastFinishedPulling="2026-04-20 15:02:49.657936462 +0000 UTC m=+20.093480600" observedRunningTime="2026-04-20 15:02:50.360751622 +0000 UTC m=+20.796295775" watchObservedRunningTime="2026-04-20 15:02:50.37667123 +0000 UTC m=+20.812215383" Apr 20 15:02:50.392901 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:50.392858 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-kztg4" podStartSLOduration=8.21335106 podStartE2EDuration="20.392840231s" podCreationTimestamp="2026-04-20 15:02:30 +0000 UTC" firstStartedPulling="2026-04-20 15:02:32.782414916 +0000 UTC m=+3.217959063" lastFinishedPulling="2026-04-20 15:02:44.961904088 +0000 UTC m=+15.397448234" 
observedRunningTime="2026-04-20 15:02:50.376470084 +0000 UTC m=+20.812014240" watchObservedRunningTime="2026-04-20 15:02:50.392840231 +0000 UTC m=+20.828384444" Apr 20 15:02:50.393118 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:50.393100 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hc5nc" podStartSLOduration=10.393095441 podStartE2EDuration="10.393095441s" podCreationTimestamp="2026-04-20 15:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:02:50.392722495 +0000 UTC m=+20.828266648" watchObservedRunningTime="2026-04-20 15:02:50.393095441 +0000 UTC m=+20.828639594" Apr 20 15:02:51.190949 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:51.190924 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:51.191114 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:51.191025 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z8nvj" podUID="0bee40ff-47b7-46ba-adb1-4493194f0ff8" Apr 20 15:02:51.252545 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:51.252515 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 15:02:51.302081 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:51.301987 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f6vw2" event={"ID":"7d301008-567f-46b2-9fbb-f3d7ea2a6456","Type":"ContainerStarted","Data":"87097406962eb30796a1208e03b121b612c7676a2714913d079e6ba071c714a9"} Apr 20 15:02:51.303897 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:51.303865 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" event={"ID":"9f4afccb-d2c5-4658-8e04-1c1e51f934ee","Type":"ContainerStarted","Data":"d61ccdb9dd97459984f0a791d6db187321cd78ec1344c326a3b48a0148c82038"} Apr 20 15:02:51.306790 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:51.306770 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-acl-logging/0.log" Apr 20 15:02:51.307170 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:51.307139 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" event={"ID":"65dfd729-42e3-45f2-8f76-eec9fa62c8c4","Type":"ContainerStarted","Data":"d086180b29890a7d70ea30c79d186fad8de3978a933af0dadf1126bd53f32dd9"} Apr 20 15:02:51.307253 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:51.307175 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" event={"ID":"65dfd729-42e3-45f2-8f76-eec9fa62c8c4","Type":"ContainerStarted","Data":"c0c77d53ee85e32c89fad220677fbdc91c35170a681c39cadf36938023b948d8"} Apr 20 15:02:51.308611 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:51.308590 2577 generic.go:358] "Generic (PLEG): container finished" podID="a42bcb4e-3ab0-49cb-8302-c0f2152edc3d" 
containerID="9daf3b27734e80ba19f4bc5d9ebbfe2ff0e21d39cf7afec86b5b1bb456526bae" exitCode=0 Apr 20 15:02:51.308706 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:51.308681 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cx4lv" event={"ID":"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d","Type":"ContainerDied","Data":"9daf3b27734e80ba19f4bc5d9ebbfe2ff0e21d39cf7afec86b5b1bb456526bae"} Apr 20 15:02:51.316099 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:51.316057 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-f6vw2" podStartSLOduration=4.4715232 podStartE2EDuration="21.316040899s" podCreationTimestamp="2026-04-20 15:02:30 +0000 UTC" firstStartedPulling="2026-04-20 15:02:32.788534979 +0000 UTC m=+3.224079126" lastFinishedPulling="2026-04-20 15:02:49.633052688 +0000 UTC m=+20.068596825" observedRunningTime="2026-04-20 15:02:51.31601137 +0000 UTC m=+21.751555525" watchObservedRunningTime="2026-04-20 15:02:51.316040899 +0000 UTC m=+21.751585069" Apr 20 15:02:52.134792 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:52.134676 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T15:02:51.252536016Z","UUID":"0f58114b-d60c-4504-b1fe-f92bdedff7c7","Handler":null,"Name":"","Endpoint":""} Apr 20 15:02:52.136589 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:52.136554 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 15:02:52.136589 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:52.136591 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 15:02:52.190957 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:52.190908 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:52.191162 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:52.191045 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dp887" podUID="5987592a-660d-4466-bbc4-5bd812cca838" Apr 20 15:02:52.435658 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:52.435463 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-kztg4" Apr 20 15:02:52.436189 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:52.436167 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-kztg4" Apr 20 15:02:53.191269 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:53.191240 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:53.191477 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:53.191403 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z8nvj" podUID="0bee40ff-47b7-46ba-adb1-4493194f0ff8" Apr 20 15:02:53.314366 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:53.314324 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" event={"ID":"9f4afccb-d2c5-4658-8e04-1c1e51f934ee","Type":"ContainerStarted","Data":"6c6c4e05b54632cc565ee4da11dc96ed25ecff6cea7c8936a3d097d75cbab3cb"} Apr 20 15:02:53.317577 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:53.317549 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-acl-logging/0.log" Apr 20 15:02:53.317990 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:53.317964 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" event={"ID":"65dfd729-42e3-45f2-8f76-eec9fa62c8c4","Type":"ContainerStarted","Data":"57af8611877dc30f5150b55b65c0467609be59241fcbe6770ace3bc042e2e882"} Apr 20 15:02:53.318193 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:53.318175 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-kztg4" Apr 20 15:02:53.318741 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:53.318714 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-kztg4" Apr 20 15:02:53.332421 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:53.332368 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dkth8" podStartSLOduration=3.837427963 podStartE2EDuration="23.332350605s" podCreationTimestamp="2026-04-20 15:02:30 +0000 UTC" firstStartedPulling="2026-04-20 15:02:32.788545529 +0000 UTC m=+3.224089668" lastFinishedPulling="2026-04-20 15:02:52.283468164 +0000 UTC m=+22.719012310" observedRunningTime="2026-04-20 15:02:53.331861933 +0000 UTC m=+23.767406078" watchObservedRunningTime="2026-04-20 15:02:53.332350605 +0000 UTC m=+23.767894761" Apr 20 15:02:54.191372 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:54.191332 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:54.191986 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:54.191496 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dp887" podUID="5987592a-660d-4466-bbc4-5bd812cca838" Apr 20 15:02:55.191302 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:55.191013 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:55.191464 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:55.191300 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z8nvj" podUID="0bee40ff-47b7-46ba-adb1-4493194f0ff8" Apr 20 15:02:55.325300 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:55.325270 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-acl-logging/0.log" Apr 20 15:02:55.325658 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:55.325632 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" event={"ID":"65dfd729-42e3-45f2-8f76-eec9fa62c8c4","Type":"ContainerStarted","Data":"ec8858e0f5196b7a92928117edd0bb7a563c9bd854612380c8a97d7cdeee6adf"} Apr 20 15:02:55.325990 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:55.325951 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:55.326196 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:55.326182 2577 scope.go:117] "RemoveContainer" containerID="52410a9e3cd7b01138a2d38a8de2e53299fab0e1c9f00854c38e436027818de0" Apr 20 15:02:55.327314 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:55.327273 2577 generic.go:358] "Generic (PLEG): container finished" podID="a42bcb4e-3ab0-49cb-8302-c0f2152edc3d" containerID="688db42226502b19256b8273c038b8249b5b9a550bbe8d39dcd727f2396efe59" exitCode=0 Apr 20 15:02:55.327459 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:55.327386 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cx4lv" event={"ID":"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d","Type":"ContainerDied","Data":"688db42226502b19256b8273c038b8249b5b9a550bbe8d39dcd727f2396efe59"} Apr 20 15:02:55.341988 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:55.341963 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:56.191797 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:56.191771 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:56.192174 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:56.191894 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dp887" podUID="5987592a-660d-4466-bbc4-5bd812cca838" Apr 20 15:02:56.333204 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:56.333114 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-acl-logging/0.log" Apr 20 15:02:56.333524 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:56.333487 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" event={"ID":"65dfd729-42e3-45f2-8f76-eec9fa62c8c4","Type":"ContainerStarted","Data":"be7c40f757ac4ed2e948f06ad7262bcc6698b08d4a6e2e8454dd43a4003e9844"} Apr 20 15:02:56.333865 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:56.333848 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:56.333957 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:56.333875 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:56.335755 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:56.335731 2577 generic.go:358] "Generic (PLEG): container finished" podID="a42bcb4e-3ab0-49cb-8302-c0f2152edc3d" containerID="8e01f65698f3b877aa2a15f29044488cd1e218f8478b282d00acf84f72af5b72" exitCode=0 Apr 20 15:02:56.335839 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:56.335768 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cx4lv" event={"ID":"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d","Type":"ContainerDied","Data":"8e01f65698f3b877aa2a15f29044488cd1e218f8478b282d00acf84f72af5b72"} Apr 20 15:02:56.350175 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:56.350150 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:02:56.363203 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:56.363155 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" podStartSLOduration=9.438130671 podStartE2EDuration="26.363140994s" podCreationTimestamp="2026-04-20 15:02:30 +0000 UTC" firstStartedPulling="2026-04-20 15:02:32.786656501 +0000 UTC m=+3.222200633" lastFinishedPulling="2026-04-20 15:02:49.711666819 +0000 UTC m=+20.147210956" observedRunningTime="2026-04-20 15:02:56.361902363 +0000 UTC m=+26.797446517" watchObservedRunningTime="2026-04-20 15:02:56.363140994 +0000 UTC m=+26.798685147" Apr 20 15:02:56.717716 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:56.717682 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-z8nvj"] Apr 20 15:02:56.717912 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:56.717830 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:56.717977 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:56.717919 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z8nvj" podUID="0bee40ff-47b7-46ba-adb1-4493194f0ff8" Apr 20 15:02:56.720743 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:56.720716 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dp887"] Apr 20 15:02:56.720871 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:56.720860 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:56.721012 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:56.720986 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dp887" podUID="5987592a-660d-4466-bbc4-5bd812cca838" Apr 20 15:02:57.340174 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:57.339841 2577 generic.go:358] "Generic (PLEG): container finished" podID="a42bcb4e-3ab0-49cb-8302-c0f2152edc3d" containerID="c1eeeaf68866c77c92320209ce17ce66c503a21c6f47336a5d195b3dd2873eda" exitCode=0 Apr 20 15:02:57.340546 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:57.339925 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cx4lv" event={"ID":"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d","Type":"ContainerDied","Data":"c1eeeaf68866c77c92320209ce17ce66c503a21c6f47336a5d195b3dd2873eda"} Apr 20 15:02:58.191425 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:58.191387 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:02:58.191623 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:02:58.191463 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:02:58.191623 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:58.191579 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z8nvj" podUID="0bee40ff-47b7-46ba-adb1-4493194f0ff8" Apr 20 15:02:58.191735 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:02:58.191707 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dp887" podUID="5987592a-660d-4466-bbc4-5bd812cca838" Apr 20 15:03:00.191820 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:00.191791 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:03:00.192394 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:00.191900 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dp887" podUID="5987592a-660d-4466-bbc4-5bd812cca838" Apr 20 15:03:00.192394 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:00.191970 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:03:00.192394 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:00.192052 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z8nvj" podUID="0bee40ff-47b7-46ba-adb1-4493194f0ff8" Apr 20 15:03:02.191635 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:02.191602 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:03:02.192222 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:02.191735 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dp887" podUID="5987592a-660d-4466-bbc4-5bd812cca838" Apr 20 15:03:02.192222 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:02.191792 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:03:02.192222 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:02.191885 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z8nvj" podUID="0bee40ff-47b7-46ba-adb1-4493194f0ff8" Apr 20 15:03:02.899612 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:02.899534 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-115.ec2.internal" event="NodeReady" Apr 20 15:03:02.899773 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:02.899700 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 15:03:02.938659 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:02.938630 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rwbpz"] Apr 20 15:03:02.950608 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:02.950577 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7rhcs"] Apr 20 15:03:02.950755 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:02.950730 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rwbpz" Apr 20 15:03:02.953060 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:02.953037 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vdrzr\"" Apr 20 15:03:02.953190 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:02.953091 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 15:03:02.953394 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:02.953372 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 15:03:02.962052 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:02.962027 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rwbpz"] Apr 20 15:03:02.962144 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:02.962062 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7rhcs"] Apr 20 15:03:02.962182 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:02.962164 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7rhcs" Apr 20 15:03:02.964597 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:02.964566 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4l2x2\"" Apr 20 15:03:02.964719 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:02.964601 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 15:03:02.964719 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:02.964566 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 15:03:02.964719 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:02.964644 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 15:03:03.113598 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.113548 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbk88\" (UniqueName: \"kubernetes.io/projected/cbcbf670-2941-48c2-8a4a-b5f253135d10-kube-api-access-cbk88\") pod \"ingress-canary-7rhcs\" (UID: \"cbcbf670-2941-48c2-8a4a-b5f253135d10\") " pod="openshift-ingress-canary/ingress-canary-7rhcs" Apr 20 15:03:03.113765 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.113614 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls\") pod \"dns-default-rwbpz\" (UID: \"548662da-3b01-418f-b71e-7805525a03e5\") " pod="openshift-dns/dns-default-rwbpz" Apr 20 15:03:03.113765 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.113682 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/548662da-3b01-418f-b71e-7805525a03e5-config-volume\") pod \"dns-default-rwbpz\" (UID: \"548662da-3b01-418f-b71e-7805525a03e5\") " pod="openshift-dns/dns-default-rwbpz" Apr 20 15:03:03.113765 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.113736 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-z8gcx\" (UniqueName: \"kubernetes.io/projected/548662da-3b01-418f-b71e-7805525a03e5-kube-api-access-z8gcx\") pod \"dns-default-rwbpz\" (UID: \"548662da-3b01-418f-b71e-7805525a03e5\") " pod="openshift-dns/dns-default-rwbpz" Apr 20 15:03:03.113874 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.113772 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert\") pod \"ingress-canary-7rhcs\" (UID: \"cbcbf670-2941-48c2-8a4a-b5f253135d10\") " pod="openshift-ingress-canary/ingress-canary-7rhcs" Apr 20 15:03:03.113874 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.113790 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/548662da-3b01-418f-b71e-7805525a03e5-tmp-dir\") pod \"dns-default-rwbpz\" (UID: \"548662da-3b01-418f-b71e-7805525a03e5\") " pod="openshift-dns/dns-default-rwbpz" Apr 20 15:03:03.219235 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.219194 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8gcx\" (UniqueName: \"kubernetes.io/projected/548662da-3b01-418f-b71e-7805525a03e5-kube-api-access-z8gcx\") pod \"dns-default-rwbpz\" (UID: \"548662da-3b01-418f-b71e-7805525a03e5\") " pod="openshift-dns/dns-default-rwbpz" Apr 20 15:03:03.219996 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.219263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert\") pod \"ingress-canary-7rhcs\" (UID: \"cbcbf670-2941-48c2-8a4a-b5f253135d10\") " pod="openshift-ingress-canary/ingress-canary-7rhcs" Apr 20 15:03:03.219996 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.219319 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/548662da-3b01-418f-b71e-7805525a03e5-tmp-dir\") pod \"dns-default-rwbpz\" (UID: \"548662da-3b01-418f-b71e-7805525a03e5\") " pod="openshift-dns/dns-default-rwbpz" Apr 20 15:03:03.219996 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.219409 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbk88\" (UniqueName: \"kubernetes.io/projected/cbcbf670-2941-48c2-8a4a-b5f253135d10-kube-api-access-cbk88\") pod \"ingress-canary-7rhcs\" (UID: \"cbcbf670-2941-48c2-8a4a-b5f253135d10\") " pod="openshift-ingress-canary/ingress-canary-7rhcs" Apr 20 15:03:03.219996 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:03.219419 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:03:03.219996 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.219441 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls\") pod \"dns-default-rwbpz\" (UID: \"548662da-3b01-418f-b71e-7805525a03e5\") " pod="openshift-dns/dns-default-rwbpz" Apr 20 15:03:03.219996 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.219469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/548662da-3b01-418f-b71e-7805525a03e5-config-volume\") pod \"dns-default-rwbpz\" (UID: 
\"548662da-3b01-418f-b71e-7805525a03e5\") " pod="openshift-dns/dns-default-rwbpz" Apr 20 15:03:03.219996 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:03.219490 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert podName:cbcbf670-2941-48c2-8a4a-b5f253135d10 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:03.71946858 +0000 UTC m=+34.155012727 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert") pod "ingress-canary-7rhcs" (UID: "cbcbf670-2941-48c2-8a4a-b5f253135d10") : secret "canary-serving-cert" not found Apr 20 15:03:03.219996 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:03.219543 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:03:03.219996 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:03.219588 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls podName:548662da-3b01-418f-b71e-7805525a03e5 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:03.719573588 +0000 UTC m=+34.155117734 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls") pod "dns-default-rwbpz" (UID: "548662da-3b01-418f-b71e-7805525a03e5") : secret "dns-default-metrics-tls" not found Apr 20 15:03:03.219996 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.219755 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/548662da-3b01-418f-b71e-7805525a03e5-tmp-dir\") pod \"dns-default-rwbpz\" (UID: \"548662da-3b01-418f-b71e-7805525a03e5\") " pod="openshift-dns/dns-default-rwbpz" Apr 20 15:03:03.228546 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.228517 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8gcx\" (UniqueName: \"kubernetes.io/projected/548662da-3b01-418f-b71e-7805525a03e5-kube-api-access-z8gcx\") pod \"dns-default-rwbpz\" (UID: \"548662da-3b01-418f-b71e-7805525a03e5\") " pod="openshift-dns/dns-default-rwbpz" Apr 20 15:03:03.228662 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.228629 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbk88\" (UniqueName: \"kubernetes.io/projected/cbcbf670-2941-48c2-8a4a-b5f253135d10-kube-api-access-cbk88\") pod \"ingress-canary-7rhcs\" (UID: \"cbcbf670-2941-48c2-8a4a-b5f253135d10\") " pod="openshift-ingress-canary/ingress-canary-7rhcs" Apr 20 15:03:03.231333 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.231304 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/548662da-3b01-418f-b71e-7805525a03e5-config-volume\") pod \"dns-default-rwbpz\" (UID: \"548662da-3b01-418f-b71e-7805525a03e5\") " pod="openshift-dns/dns-default-rwbpz" Apr 20 15:03:03.354731 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.354690 2577 generic.go:358] "Generic (PLEG): container finished" podID="a42bcb4e-3ab0-49cb-8302-c0f2152edc3d" containerID="19f2da77a4e8356443759a107dcbf2f0cd06116f4a9c2695b58569d1ed14273f" exitCode=0 Apr 20 15:03:03.354932 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.354761 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-cx4lv" event={"ID":"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d","Type":"ContainerDied","Data":"19f2da77a4e8356443759a107dcbf2f0cd06116f4a9c2695b58569d1ed14273f"} Apr 20 15:03:03.724245 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.724209 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert\") pod \"ingress-canary-7rhcs\" (UID: \"cbcbf670-2941-48c2-8a4a-b5f253135d10\") " pod="openshift-ingress-canary/ingress-canary-7rhcs" Apr 20 15:03:03.724505 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.724310 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls\") pod \"dns-default-rwbpz\" (UID: \"548662da-3b01-418f-b71e-7805525a03e5\") " pod="openshift-dns/dns-default-rwbpz" Apr 20 15:03:03.724505 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:03.724377 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:03:03.724505 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:03.724419 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:03:03.724505 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:03.724440 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert podName:cbcbf670-2941-48c2-8a4a-b5f253135d10 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:04.724424948 +0000 UTC m=+35.159969080 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert") pod "ingress-canary-7rhcs" (UID: "cbcbf670-2941-48c2-8a4a-b5f253135d10") : secret "canary-serving-cert" not found Apr 20 15:03:03.724505 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:03.724473 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls podName:548662da-3b01-418f-b71e-7805525a03e5 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:04.724460305 +0000 UTC m=+35.160004437 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls") pod "dns-default-rwbpz" (UID: "548662da-3b01-418f-b71e-7805525a03e5") : secret "dns-default-metrics-tls" not found Apr 20 15:03:03.824670 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.824623 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs\") pod \"network-metrics-daemon-dp887\" (UID: \"5987592a-660d-4466-bbc4-5bd812cca838\") " pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:03:03.824928 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:03.824776 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:03:03.824928 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:03.824846 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs podName:5987592a-660d-4466-bbc4-5bd812cca838 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:35.824831239 +0000 UTC m=+66.260375375 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs") pod "network-metrics-daemon-dp887" (UID: "5987592a-660d-4466-bbc4-5bd812cca838") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:03:03.925869 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:03.925833 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csldl\" (UniqueName: \"kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl\") pod \"network-check-target-z8nvj\" (UID: \"0bee40ff-47b7-46ba-adb1-4493194f0ff8\") " pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:03:03.926006 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:03.925986 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 15:03:03.926006 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:03.926000 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 15:03:03.926079 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:03.926009 2577 projected.go:194] Error preparing data for projected volume kube-api-access-csldl for pod openshift-network-diagnostics/network-check-target-z8nvj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:03:03.926079 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:03.926059 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl podName:0bee40ff-47b7-46ba-adb1-4493194f0ff8 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:35.926046019 +0000 UTC m=+66.361590150 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-csldl" (UniqueName: "kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl") pod "network-check-target-z8nvj" (UID: "0bee40ff-47b7-46ba-adb1-4493194f0ff8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:03:04.191341 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:04.191234 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:03:04.191520 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:04.191430 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:03:04.194000 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:04.193975 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 15:03:04.194000 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:04.193974 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 15:03:04.194192 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:04.194016 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-64ff7\"" Apr 20 15:03:04.194192 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:04.193979 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dhzk6\"" Apr 20 15:03:04.194607 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:04.194593 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 15:03:04.359345 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:04.359308 2577 generic.go:358] "Generic (PLEG): container finished" podID="a42bcb4e-3ab0-49cb-8302-c0f2152edc3d" containerID="8d8aacc56543751781a89f9a8d4969a99292a45a4518c0fab6488564b6e1c68e" exitCode=0 Apr 20 15:03:04.359733 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:04.359360 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cx4lv" event={"ID":"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d","Type":"ContainerDied","Data":"8d8aacc56543751781a89f9a8d4969a99292a45a4518c0fab6488564b6e1c68e"} Apr 20 15:03:04.731120 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:04.730896 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert\") pod \"ingress-canary-7rhcs\" (UID: \"cbcbf670-2941-48c2-8a4a-b5f253135d10\") " pod="openshift-ingress-canary/ingress-canary-7rhcs" Apr 20 15:03:04.731342 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:04.731050 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:03:04.731342 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:04.731254 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:03:04.731342 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:04.731185 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls\") pod \"dns-default-rwbpz\" (UID: \"548662da-3b01-418f-b71e-7805525a03e5\") " pod="openshift-dns/dns-default-rwbpz" Apr 20 15:03:04.731342 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:04.731259 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert podName:cbcbf670-2941-48c2-8a4a-b5f253135d10 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:06.731237945 +0000 UTC m=+37.166782099 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert") pod "ingress-canary-7rhcs" (UID: "cbcbf670-2941-48c2-8a4a-b5f253135d10") : secret "canary-serving-cert" not found Apr 20 15:03:04.731342 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:04.731335 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls podName:548662da-3b01-418f-b71e-7805525a03e5 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:06.731314501 +0000 UTC m=+37.166858636 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls") pod "dns-default-rwbpz" (UID: "548662da-3b01-418f-b71e-7805525a03e5") : secret "dns-default-metrics-tls" not found Apr 20 15:03:05.364456 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:05.364420 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cx4lv" event={"ID":"a42bcb4e-3ab0-49cb-8302-c0f2152edc3d","Type":"ContainerStarted","Data":"09abdb607b1bc78be8042b39fe8a77904bd26a6823fd37bc4610027c7ec4b29c"} Apr 20 15:03:05.388423 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:05.388368 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-cx4lv" podStartSLOduration=5.237242002 podStartE2EDuration="35.388351695s" podCreationTimestamp="2026-04-20 15:02:30 +0000 UTC" firstStartedPulling="2026-04-20 15:02:32.785053915 +0000 UTC m=+3.220598051" lastFinishedPulling="2026-04-20 15:03:02.936163613 +0000 UTC m=+33.371707744" observedRunningTime="2026-04-20 15:03:05.386773106 +0000 UTC m=+35.822317260" watchObservedRunningTime="2026-04-20 15:03:05.388351695 +0000 UTC m=+35.823895850" Apr 20 15:03:06.745729 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:06.745680 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls\") pod \"dns-default-rwbpz\" (UID: \"548662da-3b01-418f-b71e-7805525a03e5\") " pod="openshift-dns/dns-default-rwbpz" Apr 20 15:03:06.746133 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:06.745773 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert\") pod \"ingress-canary-7rhcs\" (UID: \"cbcbf670-2941-48c2-8a4a-b5f253135d10\") " pod="openshift-ingress-canary/ingress-canary-7rhcs" Apr 20 15:03:06.746133 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:06.745832 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:03:06.746133 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:06.745909 2577 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls podName:548662da-3b01-418f-b71e-7805525a03e5 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:10.745893012 +0000 UTC m=+41.181437143 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls") pod "dns-default-rwbpz" (UID: "548662da-3b01-418f-b71e-7805525a03e5") : secret "dns-default-metrics-tls" not found Apr 20 15:03:06.746133 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:06.745927 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:03:06.746133 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:06.745998 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert podName:cbcbf670-2941-48c2-8a4a-b5f253135d10 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:10.74597909 +0000 UTC m=+41.181523242 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert") pod "ingress-canary-7rhcs" (UID: "cbcbf670-2941-48c2-8a4a-b5f253135d10") : secret "canary-serving-cert" not found Apr 20 15:03:10.773704 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:10.773665 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert\") pod \"ingress-canary-7rhcs\" (UID: \"cbcbf670-2941-48c2-8a4a-b5f253135d10\") " pod="openshift-ingress-canary/ingress-canary-7rhcs" Apr 20 15:03:10.774167 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:10.773761 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls\") pod \"dns-default-rwbpz\" (UID: \"548662da-3b01-418f-b71e-7805525a03e5\") " pod="openshift-dns/dns-default-rwbpz" Apr 20 15:03:10.774167 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:10.773832 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:03:10.774167 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:10.773903 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:03:10.774167 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:10.773969 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert podName:cbcbf670-2941-48c2-8a4a-b5f253135d10 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:18.773950903 +0000 UTC m=+49.209495036 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert") pod "ingress-canary-7rhcs" (UID: "cbcbf670-2941-48c2-8a4a-b5f253135d10") : secret "canary-serving-cert" not found Apr 20 15:03:10.774167 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:10.773990 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls podName:548662da-3b01-418f-b71e-7805525a03e5 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:18.773983759 +0000 UTC m=+49.209527891 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls") pod "dns-default-rwbpz" (UID: "548662da-3b01-418f-b71e-7805525a03e5") : secret "dns-default-metrics-tls" not found Apr 20 15:03:18.828347 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:18.828304 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls\") pod \"dns-default-rwbpz\" (UID: \"548662da-3b01-418f-b71e-7805525a03e5\") " pod="openshift-dns/dns-default-rwbpz" Apr 20 15:03:18.828347 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:18.828363 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert\") pod \"ingress-canary-7rhcs\" (UID: \"cbcbf670-2941-48c2-8a4a-b5f253135d10\") " pod="openshift-ingress-canary/ingress-canary-7rhcs" Apr 20 15:03:18.828866 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:18.828493 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:03:18.828866 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:18.828577 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls podName:548662da-3b01-418f-b71e-7805525a03e5 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:34.828559101 +0000 UTC m=+65.264103233 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls") pod "dns-default-rwbpz" (UID: "548662da-3b01-418f-b71e-7805525a03e5") : secret "dns-default-metrics-tls" not found Apr 20 15:03:18.828866 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:18.828580 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:03:18.828866 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:18.828636 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert podName:cbcbf670-2941-48c2-8a4a-b5f253135d10 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:34.828620812 +0000 UTC m=+65.264164961 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert") pod "ingress-canary-7rhcs" (UID: "cbcbf670-2941-48c2-8a4a-b5f253135d10") : secret "canary-serving-cert" not found Apr 20 15:03:28.352765 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:28.352732 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x9pxn" Apr 20 15:03:34.837391 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:34.837343 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert\") pod \"ingress-canary-7rhcs\" (UID: \"cbcbf670-2941-48c2-8a4a-b5f253135d10\") " pod="openshift-ingress-canary/ingress-canary-7rhcs" Apr 20 15:03:34.837792 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:34.837438 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls\") pod \"dns-default-rwbpz\" (UID: \"548662da-3b01-418f-b71e-7805525a03e5\") " pod="openshift-dns/dns-default-rwbpz" Apr 20 15:03:34.837792 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:34.837519 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:03:34.837792 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:34.837579 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls podName:548662da-3b01-418f-b71e-7805525a03e5 nodeName:}" failed. No retries permitted until 2026-04-20 15:04:06.837564701 +0000 UTC m=+97.273108833 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls") pod "dns-default-rwbpz" (UID: "548662da-3b01-418f-b71e-7805525a03e5") : secret "dns-default-metrics-tls" not found Apr 20 15:03:34.837792 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:34.837520 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:03:34.837792 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:34.837656 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert podName:cbcbf670-2941-48c2-8a4a-b5f253135d10 nodeName:}" failed. No retries permitted until 2026-04-20 15:04:06.837643635 +0000 UTC m=+97.273187783 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert") pod "ingress-canary-7rhcs" (UID: "cbcbf670-2941-48c2-8a4a-b5f253135d10") : secret "canary-serving-cert" not found Apr 20 15:03:35.842415 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:35.842375 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs\") pod \"network-metrics-daemon-dp887\" (UID: \"5987592a-660d-4466-bbc4-5bd812cca838\") " pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:03:35.845093 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:35.845070 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 15:03:35.853176 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:35.853151 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 15:03:35.853312 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:03:35.853227 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs podName:5987592a-660d-4466-bbc4-5bd812cca838 nodeName:}" failed. No retries permitted until 2026-04-20 15:04:39.853204359 +0000 UTC m=+130.288748505 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs") pod "network-metrics-daemon-dp887" (UID: "5987592a-660d-4466-bbc4-5bd812cca838") : secret "metrics-daemon-secret" not found Apr 20 15:03:35.943116 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:35.943083 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csldl\" (UniqueName: \"kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl\") pod \"network-check-target-z8nvj\" (UID: \"0bee40ff-47b7-46ba-adb1-4493194f0ff8\") " pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:03:35.945772 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:35.945753 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 15:03:35.956445 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:35.956422 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 15:03:35.972097 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:35.972068 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-csldl\" (UniqueName: \"kubernetes.io/projected/0bee40ff-47b7-46ba-adb1-4493194f0ff8-kube-api-access-csldl\") pod \"network-check-target-z8nvj\" (UID: \"0bee40ff-47b7-46ba-adb1-4493194f0ff8\") " pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:03:36.005623 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:36.005592 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dhzk6\"" Apr 20 15:03:36.011831 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:36.011810 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:03:36.137611 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:36.137577 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-z8nvj"] Apr 20 15:03:36.141005 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:03:36.140961 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bee40ff_47b7_46ba_adb1_4493194f0ff8.slice/crio-20217e4eba91722c88e9bd72334a083202d90b325c540ddae7f92b09221bf3ea WatchSource:0}: Error finding container 20217e4eba91722c88e9bd72334a083202d90b325c540ddae7f92b09221bf3ea: Status 404 returned error can't find the container with id 20217e4eba91722c88e9bd72334a083202d90b325c540ddae7f92b09221bf3ea Apr 20 15:03:36.423088 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:36.423002 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-z8nvj" event={"ID":"0bee40ff-47b7-46ba-adb1-4493194f0ff8","Type":"ContainerStarted","Data":"20217e4eba91722c88e9bd72334a083202d90b325c540ddae7f92b09221bf3ea"} Apr 20 15:03:39.432176 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:39.432137 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-z8nvj" event={"ID":"0bee40ff-47b7-46ba-adb1-4493194f0ff8","Type":"ContainerStarted","Data":"7226e4aecf007febde6b203a2433161afb1417c1a00133dd4076cb77a7536661"} Apr 20 15:03:39.432598 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:39.432265 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:03:39.448854 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:03:39.448798 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-z8nvj" podStartSLOduration=66.878793774 podStartE2EDuration="1m9.448784547s" podCreationTimestamp="2026-04-20 15:02:30 +0000 UTC" firstStartedPulling="2026-04-20 15:03:36.142653537 +0000 UTC m=+66.578197669" lastFinishedPulling="2026-04-20 15:03:38.712644295 +0000 UTC m=+69.148188442" observedRunningTime="2026-04-20 15:03:39.448217284 +0000 UTC m=+69.883761428" watchObservedRunningTime="2026-04-20 15:03:39.448784547 +0000 UTC m=+69.884328717" Apr 20 15:04:06.848956 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:06.848914 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls\") pod \"dns-default-rwbpz\" (UID: \"548662da-3b01-418f-b71e-7805525a03e5\") " pod="openshift-dns/dns-default-rwbpz" Apr 20 15:04:06.849410 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:06.848984 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert\") pod \"ingress-canary-7rhcs\" (UID: \"cbcbf670-2941-48c2-8a4a-b5f253135d10\") " pod="openshift-ingress-canary/ingress-canary-7rhcs" Apr 20 15:04:06.849410 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:06.849105 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:04:06.849410 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:06.849105 2577 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:04:06.849410 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:06.849212 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert podName:cbcbf670-2941-48c2-8a4a-b5f253135d10 nodeName:}" failed. No retries permitted until 2026-04-20 15:05:10.849191737 +0000 UTC m=+161.284735892 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert") pod "ingress-canary-7rhcs" (UID: "cbcbf670-2941-48c2-8a4a-b5f253135d10") : secret "canary-serving-cert" not found Apr 20 15:04:06.849410 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:06.849242 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls podName:548662da-3b01-418f-b71e-7805525a03e5 nodeName:}" failed. No retries permitted until 2026-04-20 15:05:10.849227663 +0000 UTC m=+161.284771803 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls") pod "dns-default-rwbpz" (UID: "548662da-3b01-418f-b71e-7805525a03e5") : secret "dns-default-metrics-tls" not found Apr 20 15:04:10.436800 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:10.436770 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-z8nvj" Apr 20 15:04:39.873480 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:39.873423 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs\") pod \"network-metrics-daemon-dp887\" (UID: \"5987592a-660d-4466-bbc4-5bd812cca838\") " pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:04:39.873983 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:39.873582 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 15:04:39.873983 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:39.873660 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs podName:5987592a-660d-4466-bbc4-5bd812cca838 nodeName:}" failed. No retries permitted until 2026-04-20 15:06:41.873642657 +0000 UTC m=+252.309186812 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs") pod "network-metrics-daemon-dp887" (UID: "5987592a-660d-4466-bbc4-5bd812cca838") : secret "metrics-daemon-secret" not found Apr 20 15:04:52.139001 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.138967 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dsskp"] Apr 20 15:04:52.141715 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.141689 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dsskp" Apr 20 15:04:52.142149 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.142124 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-mgv7v"] Apr 20 15:04:52.144749 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.144727 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 20 15:04:52.145151 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.145130 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-8649c78dc4-vwfc7"] Apr 20 15:04:52.145327 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.145308 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-mgv7v" Apr 20 15:04:52.145484 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.145466 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 20 15:04:52.146153 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.146135 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-fzcbz\"" Apr 20 15:04:52.147299 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.147258 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 20 15:04:52.147417 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.147403 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 15:04:52.147775 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.147758 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 15:04:52.147936 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.147788 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-vt8tc\"" Apr 20 15:04:52.148044 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.147939 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:52.148044 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.147979 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 20 15:04:52.150261 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.150211 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-842kq\"" Apr 20 15:04:52.150468 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.150447 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 20 15:04:52.150773 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.150756 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 20 15:04:52.150852 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.150773 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 20 15:04:52.150913 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.150874 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 20 15:04:52.150913 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.150887 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 15:04:52.151021 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.150916 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 15:04:52.153866 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.153844 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dsskp"] Apr 20 15:04:52.156799 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.156781 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 20 15:04:52.160126 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.160101 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-mgv7v"] Apr 20 15:04:52.165255 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.165235 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-8649c78dc4-vwfc7"] Apr 20 15:04:52.233176 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.233144 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-brkfs"] Apr 20 15:04:52.236114 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.236097 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-brkfs" Apr 20 15:04:52.238407 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.238382 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 20 15:04:52.238548 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.238408 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 20 15:04:52.238548 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.238412 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-dhpsn\"" Apr 20 15:04:52.238548 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.238383 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 20 15:04:52.246070 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.246046 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-brkfs"] Apr 20 15:04:52.259418 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.259385 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-stats-auth\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:52.259418 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.259417 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23ae8320-d3f3-4b75-8ae2-136783ad218b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-mgv7v\" (UID: \"23ae8320-d3f3-4b75-8ae2-136783ad218b\") " pod="openshift-insights/insights-operator-585dfdc468-mgv7v" Apr 20 15:04:52.259611 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.259442 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-default-certificate\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:52.259611 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.259479 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a56509b8-31c7-4b3a-b397-2eff1d2f128c-service-ca-bundle\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:52.259611 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.259496 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg48h\" (UniqueName: \"kubernetes.io/projected/a56509b8-31c7-4b3a-b397-2eff1d2f128c-kube-api-access-jg48h\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:52.259611 ip-10-0-129-115 kubenswrapper[2577]: I0420 
15:04:52.259511 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23ae8320-d3f3-4b75-8ae2-136783ad218b-tmp\") pod \"insights-operator-585dfdc468-mgv7v\" (UID: \"23ae8320-d3f3-4b75-8ae2-136783ad218b\") " pod="openshift-insights/insights-operator-585dfdc468-mgv7v" Apr 20 15:04:52.259611 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.259525 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23ae8320-d3f3-4b75-8ae2-136783ad218b-service-ca-bundle\") pod \"insights-operator-585dfdc468-mgv7v\" (UID: \"23ae8320-d3f3-4b75-8ae2-136783ad218b\") " pod="openshift-insights/insights-operator-585dfdc468-mgv7v" Apr 20 15:04:52.259611 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.259541 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23ae8320-d3f3-4b75-8ae2-136783ad218b-serving-cert\") pod \"insights-operator-585dfdc468-mgv7v\" (UID: \"23ae8320-d3f3-4b75-8ae2-136783ad218b\") " pod="openshift-insights/insights-operator-585dfdc468-mgv7v" Apr 20 15:04:52.259611 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.259577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-metrics-certs\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:52.259611 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.259597 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxr8b\" (UniqueName: \"kubernetes.io/projected/4d9bf241-422c-4da3-9dc2-cb3f82bb8201-kube-api-access-rxr8b\") pod \"volume-data-source-validator-7c6cbb6c87-dsskp\" (UID: \"4d9bf241-422c-4da3-9dc2-cb3f82bb8201\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dsskp" Apr 20 15:04:52.259850 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.259624 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/23ae8320-d3f3-4b75-8ae2-136783ad218b-snapshots\") pod \"insights-operator-585dfdc468-mgv7v\" (UID: \"23ae8320-d3f3-4b75-8ae2-136783ad218b\") " pod="openshift-insights/insights-operator-585dfdc468-mgv7v" Apr 20 15:04:52.259850 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.259643 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fbkg\" (UniqueName: \"kubernetes.io/projected/23ae8320-d3f3-4b75-8ae2-136783ad218b-kube-api-access-5fbkg\") pod \"insights-operator-585dfdc468-mgv7v\" (UID: \"23ae8320-d3f3-4b75-8ae2-136783ad218b\") " pod="openshift-insights/insights-operator-585dfdc468-mgv7v" Apr 20 15:04:52.363397 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.363255 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-default-certificate\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:52.363397 
ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.363361 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a56509b8-31c7-4b3a-b397-2eff1d2f128c-service-ca-bundle\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:52.363649 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.363395 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jg48h\" (UniqueName: \"kubernetes.io/projected/a56509b8-31c7-4b3a-b397-2eff1d2f128c-kube-api-access-jg48h\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:52.363912 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.363521 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23ae8320-d3f3-4b75-8ae2-136783ad218b-tmp\") pod \"insights-operator-585dfdc468-mgv7v\" (UID: \"23ae8320-d3f3-4b75-8ae2-136783ad218b\") " pod="openshift-insights/insights-operator-585dfdc468-mgv7v" Apr 20 15:04:52.363963 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.363950 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23ae8320-d3f3-4b75-8ae2-136783ad218b-service-ca-bundle\") pod \"insights-operator-585dfdc468-mgv7v\" (UID: \"23ae8320-d3f3-4b75-8ae2-136783ad218b\") " pod="openshift-insights/insights-operator-585dfdc468-mgv7v" Apr 20 15:04:52.364016 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:52.363967 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a56509b8-31c7-4b3a-b397-2eff1d2f128c-service-ca-bundle podName:a56509b8-31c7-4b3a-b397-2eff1d2f128c nodeName:}" failed. No retries permitted until 2026-04-20 15:04:52.8639478 +0000 UTC m=+143.299491954 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a56509b8-31c7-4b3a-b397-2eff1d2f128c-service-ca-bundle") pod "router-default-8649c78dc4-vwfc7" (UID: "a56509b8-31c7-4b3a-b397-2eff1d2f128c") : configmap references non-existent config key: service-ca.crt Apr 20 15:04:52.364016 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.364006 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23ae8320-d3f3-4b75-8ae2-136783ad218b-serving-cert\") pod \"insights-operator-585dfdc468-mgv7v\" (UID: \"23ae8320-d3f3-4b75-8ae2-136783ad218b\") " pod="openshift-insights/insights-operator-585dfdc468-mgv7v" Apr 20 15:04:52.364125 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.364041 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-metrics-certs\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:52.364125 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.364067 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxr8b\" (UniqueName: \"kubernetes.io/projected/4d9bf241-422c-4da3-9dc2-cb3f82bb8201-kube-api-access-rxr8b\") pod \"volume-data-source-validator-7c6cbb6c87-dsskp\" (UID: \"4d9bf241-422c-4da3-9dc2-cb3f82bb8201\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dsskp" Apr 20 15:04:52.364125 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.364094 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/23ae8320-d3f3-4b75-8ae2-136783ad218b-snapshots\") pod \"insights-operator-585dfdc468-mgv7v\" (UID: \"23ae8320-d3f3-4b75-8ae2-136783ad218b\") " pod="openshift-insights/insights-operator-585dfdc468-mgv7v" Apr 20 15:04:52.364267 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.364135 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fbkg\" (UniqueName: \"kubernetes.io/projected/23ae8320-d3f3-4b75-8ae2-136783ad218b-kube-api-access-5fbkg\") pod \"insights-operator-585dfdc468-mgv7v\" (UID: \"23ae8320-d3f3-4b75-8ae2-136783ad218b\") " pod="openshift-insights/insights-operator-585dfdc468-mgv7v" Apr 20 15:04:52.364267 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.364172 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ef279ba-4008-4737-a4a5-fc05ae27cab5-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-brkfs\" (UID: \"2ef279ba-4008-4737-a4a5-fc05ae27cab5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-brkfs" Apr 20 15:04:52.364267 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:52.364195 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 15:04:52.364267 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:52.364266 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-metrics-certs podName:a56509b8-31c7-4b3a-b397-2eff1d2f128c nodeName:}" failed. 
No retries permitted until 2026-04-20 15:04:52.864246512 +0000 UTC m=+143.299790661 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-metrics-certs") pod "router-default-8649c78dc4-vwfc7" (UID: "a56509b8-31c7-4b3a-b397-2eff1d2f128c") : secret "router-metrics-certs-default" not found Apr 20 15:04:52.364515 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.364313 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqrmq\" (UniqueName: \"kubernetes.io/projected/2ef279ba-4008-4737-a4a5-fc05ae27cab5-kube-api-access-wqrmq\") pod \"cluster-samples-operator-6dc5bdb6b4-brkfs\" (UID: \"2ef279ba-4008-4737-a4a5-fc05ae27cab5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-brkfs" Apr 20 15:04:52.364515 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.364356 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-stats-auth\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:52.364515 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.364384 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23ae8320-d3f3-4b75-8ae2-136783ad218b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-mgv7v\" (UID: \"23ae8320-d3f3-4b75-8ae2-136783ad218b\") " pod="openshift-insights/insights-operator-585dfdc468-mgv7v" Apr 20 15:04:52.364654 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.364554 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23ae8320-d3f3-4b75-8ae2-136783ad218b-service-ca-bundle\") pod \"insights-operator-585dfdc468-mgv7v\" (UID: \"23ae8320-d3f3-4b75-8ae2-136783ad218b\") " pod="openshift-insights/insights-operator-585dfdc468-mgv7v" Apr 20 15:04:52.364808 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.364781 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23ae8320-d3f3-4b75-8ae2-136783ad218b-tmp\") pod \"insights-operator-585dfdc468-mgv7v\" (UID: \"23ae8320-d3f3-4b75-8ae2-136783ad218b\") " pod="openshift-insights/insights-operator-585dfdc468-mgv7v" Apr 20 15:04:52.364954 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.364933 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/23ae8320-d3f3-4b75-8ae2-136783ad218b-snapshots\") pod \"insights-operator-585dfdc468-mgv7v\" (UID: \"23ae8320-d3f3-4b75-8ae2-136783ad218b\") " pod="openshift-insights/insights-operator-585dfdc468-mgv7v" Apr 20 15:04:52.365394 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.365369 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23ae8320-d3f3-4b75-8ae2-136783ad218b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-mgv7v\" (UID: \"23ae8320-d3f3-4b75-8ae2-136783ad218b\") " pod="openshift-insights/insights-operator-585dfdc468-mgv7v" Apr 20 15:04:52.366523 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.366494 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"default-certificate\" (UniqueName: \"kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-default-certificate\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:52.366729 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.366709 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23ae8320-d3f3-4b75-8ae2-136783ad218b-serving-cert\") pod \"insights-operator-585dfdc468-mgv7v\" (UID: \"23ae8320-d3f3-4b75-8ae2-136783ad218b\") " pod="openshift-insights/insights-operator-585dfdc468-mgv7v" Apr 20 15:04:52.366879 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.366864 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-stats-auth\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:52.373164 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.373139 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg48h\" (UniqueName: \"kubernetes.io/projected/a56509b8-31c7-4b3a-b397-2eff1d2f128c-kube-api-access-jg48h\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:52.373407 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.373388 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxr8b\" (UniqueName: \"kubernetes.io/projected/4d9bf241-422c-4da3-9dc2-cb3f82bb8201-kube-api-access-rxr8b\") pod \"volume-data-source-validator-7c6cbb6c87-dsskp\" (UID: \"4d9bf241-422c-4da3-9dc2-cb3f82bb8201\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dsskp" Apr 20 15:04:52.373480 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.373430 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fbkg\" (UniqueName: \"kubernetes.io/projected/23ae8320-d3f3-4b75-8ae2-136783ad218b-kube-api-access-5fbkg\") pod \"insights-operator-585dfdc468-mgv7v\" (UID: \"23ae8320-d3f3-4b75-8ae2-136783ad218b\") " pod="openshift-insights/insights-operator-585dfdc468-mgv7v" Apr 20 15:04:52.453277 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.453177 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dsskp" Apr 20 15:04:52.461087 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.461051 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-mgv7v" Apr 20 15:04:52.465175 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.465143 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ef279ba-4008-4737-a4a5-fc05ae27cab5-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-brkfs\" (UID: \"2ef279ba-4008-4737-a4a5-fc05ae27cab5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-brkfs" Apr 20 15:04:52.465329 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.465241 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqrmq\" (UniqueName: \"kubernetes.io/projected/2ef279ba-4008-4737-a4a5-fc05ae27cab5-kube-api-access-wqrmq\") pod \"cluster-samples-operator-6dc5bdb6b4-brkfs\" (UID: \"2ef279ba-4008-4737-a4a5-fc05ae27cab5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-brkfs" Apr 20 15:04:52.465329 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:52.465307 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 15:04:52.465438 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:52.465395 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ef279ba-4008-4737-a4a5-fc05ae27cab5-samples-operator-tls podName:2ef279ba-4008-4737-a4a5-fc05ae27cab5 nodeName:}" failed. No retries permitted until 2026-04-20 15:04:52.965374992 +0000 UTC m=+143.400919139 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2ef279ba-4008-4737-a4a5-fc05ae27cab5-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-brkfs" (UID: "2ef279ba-4008-4737-a4a5-fc05ae27cab5") : secret "samples-operator-tls" not found Apr 20 15:04:52.474226 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.474198 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqrmq\" (UniqueName: \"kubernetes.io/projected/2ef279ba-4008-4737-a4a5-fc05ae27cab5-kube-api-access-wqrmq\") pod \"cluster-samples-operator-6dc5bdb6b4-brkfs\" (UID: \"2ef279ba-4008-4737-a4a5-fc05ae27cab5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-brkfs" Apr 20 15:04:52.577541 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.577509 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dsskp"] Apr 20 15:04:52.580325 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:04:52.580299 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d9bf241_422c_4da3_9dc2_cb3f82bb8201.slice/crio-0447ea82aaf9c3fd0c54bbd0186234208662b76f96a7bf29e4985f6893fc1162 WatchSource:0}: Error finding container 0447ea82aaf9c3fd0c54bbd0186234208662b76f96a7bf29e4985f6893fc1162: Status 404 returned error can't find the container with id 0447ea82aaf9c3fd0c54bbd0186234208662b76f96a7bf29e4985f6893fc1162 Apr 20 15:04:52.591819 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.591794 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-mgv7v"] Apr 20 15:04:52.594678 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:04:52.594647 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23ae8320_d3f3_4b75_8ae2_136783ad218b.slice/crio-f4b2e11ed2f9aa0bad464a72dc9d9d846f94df64c361b85c0ab6d8d7d6d2c960 WatchSource:0}: Error finding container f4b2e11ed2f9aa0bad464a72dc9d9d846f94df64c361b85c0ab6d8d7d6d2c960: Status 404 returned error can't find the container with id f4b2e11ed2f9aa0bad464a72dc9d9d846f94df64c361b85c0ab6d8d7d6d2c960 Apr 20 15:04:52.868729 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.868695 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a56509b8-31c7-4b3a-b397-2eff1d2f128c-service-ca-bundle\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:52.868729 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.868736 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-metrics-certs\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:52.868953 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:52.868867 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 15:04:52.868953 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:52.868896 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a56509b8-31c7-4b3a-b397-2eff1d2f128c-service-ca-bundle podName:a56509b8-31c7-4b3a-b397-2eff1d2f128c nodeName:}" failed. No retries permitted until 2026-04-20 15:04:53.868877418 +0000 UTC m=+144.304421571 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a56509b8-31c7-4b3a-b397-2eff1d2f128c-service-ca-bundle") pod "router-default-8649c78dc4-vwfc7" (UID: "a56509b8-31c7-4b3a-b397-2eff1d2f128c") : configmap references non-existent config key: service-ca.crt Apr 20 15:04:52.868953 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:52.868919 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-metrics-certs podName:a56509b8-31c7-4b3a-b397-2eff1d2f128c nodeName:}" failed. No retries permitted until 2026-04-20 15:04:53.868911518 +0000 UTC m=+144.304455654 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-metrics-certs") pod "router-default-8649c78dc4-vwfc7" (UID: "a56509b8-31c7-4b3a-b397-2eff1d2f128c") : secret "router-metrics-certs-default" not found Apr 20 15:04:52.969826 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:52.969772 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ef279ba-4008-4737-a4a5-fc05ae27cab5-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-brkfs\" (UID: \"2ef279ba-4008-4737-a4a5-fc05ae27cab5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-brkfs" Apr 20 15:04:52.970017 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:52.969923 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 15:04:52.970017 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:52.969990 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ef279ba-4008-4737-a4a5-fc05ae27cab5-samples-operator-tls podName:2ef279ba-4008-4737-a4a5-fc05ae27cab5 nodeName:}" failed. No retries permitted until 2026-04-20 15:04:53.969973199 +0000 UTC m=+144.405517331 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2ef279ba-4008-4737-a4a5-fc05ae27cab5-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-brkfs" (UID: "2ef279ba-4008-4737-a4a5-fc05ae27cab5") : secret "samples-operator-tls" not found Apr 20 15:04:53.574184 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:53.574109 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-mgv7v" event={"ID":"23ae8320-d3f3-4b75-8ae2-136783ad218b","Type":"ContainerStarted","Data":"f4b2e11ed2f9aa0bad464a72dc9d9d846f94df64c361b85c0ab6d8d7d6d2c960"} Apr 20 15:04:53.575356 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:53.575321 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dsskp" event={"ID":"4d9bf241-422c-4da3-9dc2-cb3f82bb8201","Type":"ContainerStarted","Data":"0447ea82aaf9c3fd0c54bbd0186234208662b76f96a7bf29e4985f6893fc1162"} Apr 20 15:04:53.876136 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:53.876039 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a56509b8-31c7-4b3a-b397-2eff1d2f128c-service-ca-bundle\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:53.876136 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:53.876100 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-metrics-certs\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:53.876372 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:53.876230 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a56509b8-31c7-4b3a-b397-2eff1d2f128c-service-ca-bundle podName:a56509b8-31c7-4b3a-b397-2eff1d2f128c nodeName:}" failed. 
No retries permitted until 2026-04-20 15:04:55.876208315 +0000 UTC m=+146.311752463 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a56509b8-31c7-4b3a-b397-2eff1d2f128c-service-ca-bundle") pod "router-default-8649c78dc4-vwfc7" (UID: "a56509b8-31c7-4b3a-b397-2eff1d2f128c") : configmap references non-existent config key: service-ca.crt Apr 20 15:04:53.876372 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:53.876227 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 15:04:53.876372 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:53.876276 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-metrics-certs podName:a56509b8-31c7-4b3a-b397-2eff1d2f128c nodeName:}" failed. No retries permitted until 2026-04-20 15:04:55.876266515 +0000 UTC m=+146.311810650 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-metrics-certs") pod "router-default-8649c78dc4-vwfc7" (UID: "a56509b8-31c7-4b3a-b397-2eff1d2f128c") : secret "router-metrics-certs-default" not found Apr 20 15:04:53.977094 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:53.977058 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ef279ba-4008-4737-a4a5-fc05ae27cab5-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-brkfs\" (UID: \"2ef279ba-4008-4737-a4a5-fc05ae27cab5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-brkfs" Apr 20 15:04:53.977315 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:53.977210 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 15:04:53.977315 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:53.977280 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ef279ba-4008-4737-a4a5-fc05ae27cab5-samples-operator-tls podName:2ef279ba-4008-4737-a4a5-fc05ae27cab5 nodeName:}" failed. No retries permitted until 2026-04-20 15:04:55.977261685 +0000 UTC m=+146.412805832 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2ef279ba-4008-4737-a4a5-fc05ae27cab5-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-brkfs" (UID: "2ef279ba-4008-4737-a4a5-fc05ae27cab5") : secret "samples-operator-tls" not found Apr 20 15:04:54.578356 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:54.578318 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dsskp" event={"ID":"4d9bf241-422c-4da3-9dc2-cb3f82bb8201","Type":"ContainerStarted","Data":"c8de055d1d0f6495543c5b7c2a0c8549c1369a768425094cdbe2656ea548bfbc"} Apr 20 15:04:54.592925 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:54.592864 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dsskp" podStartSLOduration=0.964554814 podStartE2EDuration="2.592847606s" podCreationTimestamp="2026-04-20 15:04:52 +0000 UTC" firstStartedPulling="2026-04-20 15:04:52.582121406 +0000 UTC m=+143.017665538" lastFinishedPulling="2026-04-20 15:04:54.210414184 +0000 UTC m=+144.645958330" observedRunningTime="2026-04-20 15:04:54.591958309 +0000 UTC m=+145.027502489" watchObservedRunningTime="2026-04-20 15:04:54.592847606 +0000 UTC m=+145.028391760" Apr 20 15:04:55.581924 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:55.581883 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-mgv7v" event={"ID":"23ae8320-d3f3-4b75-8ae2-136783ad218b","Type":"ContainerStarted","Data":"3f5778dff576eff6538431f7fa883057a1d6003bdd884043ed92d79c2a4a7f9e"} Apr 20 15:04:55.600995 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:55.600942 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-mgv7v" podStartSLOduration=1.431322659 podStartE2EDuration="3.600924422s" podCreationTimestamp="2026-04-20 15:04:52 +0000 UTC" firstStartedPulling="2026-04-20 15:04:52.596422895 +0000 UTC m=+143.031967027" lastFinishedPulling="2026-04-20 15:04:54.766024645 +0000 UTC m=+145.201568790" observedRunningTime="2026-04-20 15:04:55.600167358 +0000 UTC m=+146.035711516" watchObservedRunningTime="2026-04-20 15:04:55.600924422 +0000 UTC m=+146.036468578" Apr 20 15:04:55.892612 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:55.892512 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a56509b8-31c7-4b3a-b397-2eff1d2f128c-service-ca-bundle\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:55.892612 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:55.892560 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-metrics-certs\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:55.892838 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:55.892682 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 15:04:55.892838 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:55.892716 2577 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/configmap/a56509b8-31c7-4b3a-b397-2eff1d2f128c-service-ca-bundle podName:a56509b8-31c7-4b3a-b397-2eff1d2f128c nodeName:}" failed. No retries permitted until 2026-04-20 15:04:59.892693752 +0000 UTC m=+150.328237885 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a56509b8-31c7-4b3a-b397-2eff1d2f128c-service-ca-bundle") pod "router-default-8649c78dc4-vwfc7" (UID: "a56509b8-31c7-4b3a-b397-2eff1d2f128c") : configmap references non-existent config key: service-ca.crt Apr 20 15:04:55.892838 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:55.892747 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-metrics-certs podName:a56509b8-31c7-4b3a-b397-2eff1d2f128c nodeName:}" failed. No retries permitted until 2026-04-20 15:04:59.892737849 +0000 UTC m=+150.328281987 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-metrics-certs") pod "router-default-8649c78dc4-vwfc7" (UID: "a56509b8-31c7-4b3a-b397-2eff1d2f128c") : secret "router-metrics-certs-default" not found Apr 20 15:04:55.993787 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:55.993751 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ef279ba-4008-4737-a4a5-fc05ae27cab5-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-brkfs\" (UID: \"2ef279ba-4008-4737-a4a5-fc05ae27cab5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-brkfs" Apr 20 15:04:55.994001 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:55.993916 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 15:04:55.994065 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:55.994002 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ef279ba-4008-4737-a4a5-fc05ae27cab5-samples-operator-tls podName:2ef279ba-4008-4737-a4a5-fc05ae27cab5 nodeName:}" failed. No retries permitted until 2026-04-20 15:04:59.993982832 +0000 UTC m=+150.429526983 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2ef279ba-4008-4737-a4a5-fc05ae27cab5-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-brkfs" (UID: "2ef279ba-4008-4737-a4a5-fc05ae27cab5") : secret "samples-operator-tls" not found Apr 20 15:04:57.949132 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:57.949103 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hc5nc_423472ab-8267-4409-9aa0-f1d4a9c14e79/dns-node-resolver/0.log" Apr 20 15:04:58.948795 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:58.948766 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7vhjb_3b19ab41-5f62-4594-9262-8789718fb9e9/node-ca/0.log" Apr 20 15:04:59.072078 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.072045 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c6d9q"] Apr 20 15:04:59.075076 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.075060 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c6d9q" Apr 20 15:04:59.077563 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.077541 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 20 15:04:59.077691 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.077657 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 20 15:04:59.077765 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.077750 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 20 15:04:59.077803 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.077751 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-nw8wx\"" Apr 20 15:04:59.078330 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.078317 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 20 15:04:59.085125 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.085100 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c6d9q"] Apr 20 15:04:59.219667 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.219636 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd391b00-3f8a-4173-b08a-659c434b7b1a-serving-cert\") pod \"service-ca-operator-d6fc45fc5-c6d9q\" (UID: \"fd391b00-3f8a-4173-b08a-659c434b7b1a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c6d9q" Apr 20 15:04:59.219833 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.219672 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd391b00-3f8a-4173-b08a-659c434b7b1a-config\") pod \"service-ca-operator-d6fc45fc5-c6d9q\" (UID: \"fd391b00-3f8a-4173-b08a-659c434b7b1a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c6d9q" Apr 20 15:04:59.219833 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.219692 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9bk7\" (UniqueName: \"kubernetes.io/projected/fd391b00-3f8a-4173-b08a-659c434b7b1a-kube-api-access-v9bk7\") pod \"service-ca-operator-d6fc45fc5-c6d9q\" (UID: \"fd391b00-3f8a-4173-b08a-659c434b7b1a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c6d9q" Apr 20 15:04:59.320717 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.320678 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd391b00-3f8a-4173-b08a-659c434b7b1a-serving-cert\") pod \"service-ca-operator-d6fc45fc5-c6d9q\" (UID: \"fd391b00-3f8a-4173-b08a-659c434b7b1a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c6d9q" Apr 20 15:04:59.320717 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.320715 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd391b00-3f8a-4173-b08a-659c434b7b1a-config\") pod \"service-ca-operator-d6fc45fc5-c6d9q\" (UID: 
\"fd391b00-3f8a-4173-b08a-659c434b7b1a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c6d9q" Apr 20 15:04:59.320884 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.320742 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v9bk7\" (UniqueName: \"kubernetes.io/projected/fd391b00-3f8a-4173-b08a-659c434b7b1a-kube-api-access-v9bk7\") pod \"service-ca-operator-d6fc45fc5-c6d9q\" (UID: \"fd391b00-3f8a-4173-b08a-659c434b7b1a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c6d9q" Apr 20 15:04:59.321226 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.321206 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd391b00-3f8a-4173-b08a-659c434b7b1a-config\") pod \"service-ca-operator-d6fc45fc5-c6d9q\" (UID: \"fd391b00-3f8a-4173-b08a-659c434b7b1a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c6d9q" Apr 20 15:04:59.323085 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.323067 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd391b00-3f8a-4173-b08a-659c434b7b1a-serving-cert\") pod \"service-ca-operator-d6fc45fc5-c6d9q\" (UID: \"fd391b00-3f8a-4173-b08a-659c434b7b1a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c6d9q" Apr 20 15:04:59.329222 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.329199 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9bk7\" (UniqueName: \"kubernetes.io/projected/fd391b00-3f8a-4173-b08a-659c434b7b1a-kube-api-access-v9bk7\") pod \"service-ca-operator-d6fc45fc5-c6d9q\" (UID: \"fd391b00-3f8a-4173-b08a-659c434b7b1a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c6d9q" Apr 20 15:04:59.383784 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.383732 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c6d9q" Apr 20 15:04:59.499067 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.498987 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c6d9q"] Apr 20 15:04:59.502942 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:04:59.502912 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd391b00_3f8a_4173_b08a_659c434b7b1a.slice/crio-6bd4d1df3f5aa93d2f66efef2e32924cc1fa326671b2005515d6b23d1a6cfd84 WatchSource:0}: Error finding container 6bd4d1df3f5aa93d2f66efef2e32924cc1fa326671b2005515d6b23d1a6cfd84: Status 404 returned error can't find the container with id 6bd4d1df3f5aa93d2f66efef2e32924cc1fa326671b2005515d6b23d1a6cfd84 Apr 20 15:04:59.591189 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.591152 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c6d9q" event={"ID":"fd391b00-3f8a-4173-b08a-659c434b7b1a","Type":"ContainerStarted","Data":"6bd4d1df3f5aa93d2f66efef2e32924cc1fa326671b2005515d6b23d1a6cfd84"} Apr 20 15:04:59.925471 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.925371 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a56509b8-31c7-4b3a-b397-2eff1d2f128c-service-ca-bundle\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:59.925471 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:04:59.925423 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-metrics-certs\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:04:59.925683 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:59.925591 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a56509b8-31c7-4b3a-b397-2eff1d2f128c-service-ca-bundle podName:a56509b8-31c7-4b3a-b397-2eff1d2f128c nodeName:}" failed. No retries permitted until 2026-04-20 15:05:07.925567365 +0000 UTC m=+158.361111500 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a56509b8-31c7-4b3a-b397-2eff1d2f128c-service-ca-bundle") pod "router-default-8649c78dc4-vwfc7" (UID: "a56509b8-31c7-4b3a-b397-2eff1d2f128c") : configmap references non-existent config key: service-ca.crt Apr 20 15:04:59.925683 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:59.925596 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 15:04:59.925683 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:04:59.925640 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-metrics-certs podName:a56509b8-31c7-4b3a-b397-2eff1d2f128c nodeName:}" failed. No retries permitted until 2026-04-20 15:05:07.925629654 +0000 UTC m=+158.361173786 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-metrics-certs") pod "router-default-8649c78dc4-vwfc7" (UID: "a56509b8-31c7-4b3a-b397-2eff1d2f128c") : secret "router-metrics-certs-default" not found Apr 20 15:05:00.026145 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:00.026103 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ef279ba-4008-4737-a4a5-fc05ae27cab5-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-brkfs\" (UID: \"2ef279ba-4008-4737-a4a5-fc05ae27cab5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-brkfs" Apr 20 15:05:00.026349 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:05:00.026217 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 15:05:00.026349 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:05:00.026273 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ef279ba-4008-4737-a4a5-fc05ae27cab5-samples-operator-tls podName:2ef279ba-4008-4737-a4a5-fc05ae27cab5 nodeName:}" failed. No retries permitted until 2026-04-20 15:05:08.026257921 +0000 UTC m=+158.461802053 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2ef279ba-4008-4737-a4a5-fc05ae27cab5-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-brkfs" (UID: "2ef279ba-4008-4737-a4a5-fc05ae27cab5") : secret "samples-operator-tls" not found Apr 20 15:05:01.349925 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:01.349887 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-c8wr7"] Apr 20 15:05:01.354432 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:01.354412 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c8wr7" Apr 20 15:05:01.356895 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:01.356869 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 20 15:05:01.357714 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:01.357691 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-wn899\"" Apr 20 15:05:01.359376 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:01.359328 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 20 15:05:01.361021 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:01.360999 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-c8wr7"] Apr 20 15:05:01.439321 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:01.439260 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx52g\" (UniqueName: \"kubernetes.io/projected/c1c75800-433e-4b2e-a48d-b70c37d34e5a-kube-api-access-zx52g\") pod \"migrator-74bb7799d9-c8wr7\" (UID: \"c1c75800-433e-4b2e-a48d-b70c37d34e5a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c8wr7" Apr 20 15:05:01.539980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:01.539939 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zx52g\" (UniqueName: \"kubernetes.io/projected/c1c75800-433e-4b2e-a48d-b70c37d34e5a-kube-api-access-zx52g\") pod \"migrator-74bb7799d9-c8wr7\" (UID: \"c1c75800-433e-4b2e-a48d-b70c37d34e5a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c8wr7" Apr 20 15:05:01.549231 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:01.549200 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx52g\" (UniqueName: \"kubernetes.io/projected/c1c75800-433e-4b2e-a48d-b70c37d34e5a-kube-api-access-zx52g\") pod \"migrator-74bb7799d9-c8wr7\" (UID: \"c1c75800-433e-4b2e-a48d-b70c37d34e5a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c8wr7" Apr 20 15:05:01.666007 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:01.665919 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c8wr7" Apr 20 15:05:01.802238 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:01.802212 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-c8wr7"] Apr 20 15:05:01.806727 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:05:01.806705 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1c75800_433e_4b2e_a48d_b70c37d34e5a.slice/crio-33e6932cdc390d27989a9a49625a196f499a22bcea53aeba81de633a55033507 WatchSource:0}: Error finding container 33e6932cdc390d27989a9a49625a196f499a22bcea53aeba81de633a55033507: Status 404 returned error can't find the container with id 33e6932cdc390d27989a9a49625a196f499a22bcea53aeba81de633a55033507 Apr 20 15:05:02.599441 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:02.599403 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c6d9q" event={"ID":"fd391b00-3f8a-4173-b08a-659c434b7b1a","Type":"ContainerStarted","Data":"76eead5349f19a46451dfad26486f21c3feac0f1165a9273e6a99ec754f2fd1e"} Apr 20 15:05:02.600407 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:02.600386 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c8wr7" event={"ID":"c1c75800-433e-4b2e-a48d-b70c37d34e5a","Type":"ContainerStarted","Data":"33e6932cdc390d27989a9a49625a196f499a22bcea53aeba81de633a55033507"} Apr 20 15:05:02.615443 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:02.615389 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c6d9q" podStartSLOduration=1.382646244 podStartE2EDuration="3.615374225s" podCreationTimestamp="2026-04-20 15:04:59 +0000 UTC" firstStartedPulling="2026-04-20 15:04:59.504852906 +0000 UTC m=+149.940397039" lastFinishedPulling="2026-04-20 15:05:01.737580887 +0000 UTC m=+152.173125020" observedRunningTime="2026-04-20 15:05:02.614517117 +0000 UTC m=+153.050061295" watchObservedRunningTime="2026-04-20 15:05:02.615374225 +0000 UTC m=+153.050918378" Apr 20 15:05:03.604176 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:03.604141 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c8wr7" event={"ID":"c1c75800-433e-4b2e-a48d-b70c37d34e5a","Type":"ContainerStarted","Data":"4710d1d3d524d4225eb05e151be846cd30c84707f34376f607c668ef08fcf953"} Apr 20 15:05:03.604176 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:03.604177 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c8wr7" event={"ID":"c1c75800-433e-4b2e-a48d-b70c37d34e5a","Type":"ContainerStarted","Data":"2ee20c179a326c98f766f9ef676476fd7f7cb8a0912b0585db3753fb4851c393"} Apr 20 15:05:03.621321 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:03.621248 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c8wr7" podStartSLOduration=1.418646283 podStartE2EDuration="2.621232222s" podCreationTimestamp="2026-04-20 15:05:01 +0000 UTC" firstStartedPulling="2026-04-20 15:05:01.809017999 +0000 UTC m=+152.244562131" lastFinishedPulling="2026-04-20 15:05:03.011603939 +0000 UTC m=+153.447148070" observedRunningTime="2026-04-20 15:05:03.62073211 +0000 UTC 
m=+154.056276289" watchObservedRunningTime="2026-04-20 15:05:03.621232222 +0000 UTC m=+154.056776376" Apr 20 15:05:05.962527 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:05:05.962478 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-rwbpz" podUID="548662da-3b01-418f-b71e-7805525a03e5" Apr 20 15:05:05.977951 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:05:05.977903 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-7rhcs" podUID="cbcbf670-2941-48c2-8a4a-b5f253135d10" Apr 20 15:05:06.612779 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:06.612703 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rwbpz" Apr 20 15:05:06.612921 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:06.612703 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7rhcs" Apr 20 15:05:07.205430 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:05:07.205382 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-dp887" podUID="5987592a-660d-4466-bbc4-5bd812cca838" Apr 20 15:05:07.991357 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:07.991318 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a56509b8-31c7-4b3a-b397-2eff1d2f128c-service-ca-bundle\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:05:07.991565 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:07.991366 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-metrics-certs\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:05:07.991565 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:05:07.991496 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a56509b8-31c7-4b3a-b397-2eff1d2f128c-service-ca-bundle podName:a56509b8-31c7-4b3a-b397-2eff1d2f128c nodeName:}" failed. No retries permitted until 2026-04-20 15:05:23.991477046 +0000 UTC m=+174.427021189 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a56509b8-31c7-4b3a-b397-2eff1d2f128c-service-ca-bundle") pod "router-default-8649c78dc4-vwfc7" (UID: "a56509b8-31c7-4b3a-b397-2eff1d2f128c") : configmap references non-existent config key: service-ca.crt Apr 20 15:05:07.991699 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:05:07.991582 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 15:05:07.991699 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:05:07.991661 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-metrics-certs podName:a56509b8-31c7-4b3a-b397-2eff1d2f128c nodeName:}" failed. No retries permitted until 2026-04-20 15:05:23.991641463 +0000 UTC m=+174.427185614 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-metrics-certs") pod "router-default-8649c78dc4-vwfc7" (UID: "a56509b8-31c7-4b3a-b397-2eff1d2f128c") : secret "router-metrics-certs-default" not found Apr 20 15:05:08.092055 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:08.092017 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ef279ba-4008-4737-a4a5-fc05ae27cab5-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-brkfs\" (UID: \"2ef279ba-4008-4737-a4a5-fc05ae27cab5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-brkfs" Apr 20 15:05:08.092253 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:05:08.092143 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 15:05:08.092253 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:05:08.092198 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ef279ba-4008-4737-a4a5-fc05ae27cab5-samples-operator-tls podName:2ef279ba-4008-4737-a4a5-fc05ae27cab5 nodeName:}" failed. No retries permitted until 2026-04-20 15:05:24.092183442 +0000 UTC m=+174.527727575 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2ef279ba-4008-4737-a4a5-fc05ae27cab5-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-brkfs" (UID: "2ef279ba-4008-4737-a4a5-fc05ae27cab5") : secret "samples-operator-tls" not found Apr 20 15:05:10.916517 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:10.916438 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert\") pod \"ingress-canary-7rhcs\" (UID: \"cbcbf670-2941-48c2-8a4a-b5f253135d10\") " pod="openshift-ingress-canary/ingress-canary-7rhcs" Apr 20 15:05:10.916892 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:10.916517 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls\") pod \"dns-default-rwbpz\" (UID: \"548662da-3b01-418f-b71e-7805525a03e5\") " pod="openshift-dns/dns-default-rwbpz" Apr 20 15:05:10.916892 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:05:10.916594 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:05:10.916892 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:05:10.916658 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert podName:cbcbf670-2941-48c2-8a4a-b5f253135d10 nodeName:}" failed. No retries permitted until 2026-04-20 15:07:12.916640598 +0000 UTC m=+283.352184750 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert") pod "ingress-canary-7rhcs" (UID: "cbcbf670-2941-48c2-8a4a-b5f253135d10") : secret "canary-serving-cert" not found Apr 20 15:05:10.916892 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:05:10.916668 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:05:10.916892 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:05:10.916727 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls podName:548662da-3b01-418f-b71e-7805525a03e5 nodeName:}" failed. No retries permitted until 2026-04-20 15:07:12.916709597 +0000 UTC m=+283.352253730 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls") pod "dns-default-rwbpz" (UID: "548662da-3b01-418f-b71e-7805525a03e5") : secret "dns-default-metrics-tls" not found Apr 20 15:05:18.191377 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:18.191338 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:05:24.020007 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:24.019965 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a56509b8-31c7-4b3a-b397-2eff1d2f128c-service-ca-bundle\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:05:24.020007 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:24.020012 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-metrics-certs\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:05:24.020643 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:24.020624 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a56509b8-31c7-4b3a-b397-2eff1d2f128c-service-ca-bundle\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:05:24.022391 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:24.022373 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a56509b8-31c7-4b3a-b397-2eff1d2f128c-metrics-certs\") pod \"router-default-8649c78dc4-vwfc7\" (UID: \"a56509b8-31c7-4b3a-b397-2eff1d2f128c\") " pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:05:24.120888 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:24.120846 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ef279ba-4008-4737-a4a5-fc05ae27cab5-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-brkfs\" (UID: \"2ef279ba-4008-4737-a4a5-fc05ae27cab5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-brkfs" Apr 20 15:05:24.123381 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:24.123350 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ef279ba-4008-4737-a4a5-fc05ae27cab5-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-brkfs\" (UID: \"2ef279ba-4008-4737-a4a5-fc05ae27cab5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-brkfs" Apr 20 15:05:24.267097 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:24.267052 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:05:24.344597 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:24.344566 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-brkfs" Apr 20 15:05:24.392240 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:24.392052 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-8649c78dc4-vwfc7"] Apr 20 15:05:24.398591 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:05:24.398549 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda56509b8_31c7_4b3a_b397_2eff1d2f128c.slice/crio-c949be213e1ca7240f0613a55468a10a42b07185703252e9a801839cdae82969 WatchSource:0}: Error finding container c949be213e1ca7240f0613a55468a10a42b07185703252e9a801839cdae82969: Status 404 returned error can't find the container with id c949be213e1ca7240f0613a55468a10a42b07185703252e9a801839cdae82969 Apr 20 15:05:24.464075 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:24.464048 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-brkfs"] Apr 20 15:05:24.655323 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:24.655221 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-brkfs" event={"ID":"2ef279ba-4008-4737-a4a5-fc05ae27cab5","Type":"ContainerStarted","Data":"0468fcc8205bae992686d739a716bfc5e2fc4fd6ca4209b0ec1c962491f51bb9"} Apr 20 15:05:24.656390 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:24.656359 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-8649c78dc4-vwfc7" event={"ID":"a56509b8-31c7-4b3a-b397-2eff1d2f128c","Type":"ContainerStarted","Data":"6b6f0a6a38420ab4438930764f16eb72b7ff870c2dc710b5d3c0687598796117"} Apr 20 15:05:24.656390 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:24.656392 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-8649c78dc4-vwfc7" event={"ID":"a56509b8-31c7-4b3a-b397-2eff1d2f128c","Type":"ContainerStarted","Data":"c949be213e1ca7240f0613a55468a10a42b07185703252e9a801839cdae82969"} Apr 20 15:05:24.674976 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:24.674925 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-8649c78dc4-vwfc7" podStartSLOduration=32.674909974 podStartE2EDuration="32.674909974s" podCreationTimestamp="2026-04-20 15:04:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:05:24.674100566 +0000 UTC m=+175.109644747" watchObservedRunningTime="2026-04-20 15:05:24.674909974 +0000 UTC m=+175.110454128" Apr 20 15:05:25.267739 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:25.267705 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:05:25.270351 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:25.270327 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:05:25.659197 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:25.659097 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:05:25.660718 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:25.660694 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ingress/router-default-8649c78dc4-vwfc7" Apr 20 15:05:27.665639 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:27.665597 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-brkfs" event={"ID":"2ef279ba-4008-4737-a4a5-fc05ae27cab5","Type":"ContainerStarted","Data":"6e777941ac3f01881d9fd39803a4ad6bca511758a895ea6043df7c442759b34b"} Apr 20 15:05:27.665639 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:27.665639 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-brkfs" event={"ID":"2ef279ba-4008-4737-a4a5-fc05ae27cab5","Type":"ContainerStarted","Data":"77fd5fb38763b9e76da9f283745a5515af091795630b4b8c776c876a12276d1c"} Apr 20 15:05:27.680484 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:27.680429 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-brkfs" podStartSLOduration=33.324202045 podStartE2EDuration="35.680414719s" podCreationTimestamp="2026-04-20 15:04:52 +0000 UTC" firstStartedPulling="2026-04-20 15:05:24.508554598 +0000 UTC m=+174.944098731" lastFinishedPulling="2026-04-20 15:05:26.864767273 +0000 UTC m=+177.300311405" observedRunningTime="2026-04-20 15:05:27.679845543 +0000 UTC m=+178.115389699" watchObservedRunningTime="2026-04-20 15:05:27.680414719 +0000 UTC m=+178.115958873" Apr 20 15:05:28.219505 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.219461 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8mrp8"] Apr 20 15:05:28.222703 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.222685 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8mrp8" Apr 20 15:05:28.225894 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.225868 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8mlqf\"" Apr 20 15:05:28.225894 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.225888 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 15:05:28.226135 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.225896 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 15:05:28.233004 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.232982 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8mrp8"] Apr 20 15:05:28.252547 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.252516 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f34a05dd-26aa-4559-a6d7-361a7c4e19c8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8mrp8\" (UID: \"f34a05dd-26aa-4559-a6d7-361a7c4e19c8\") " pod="openshift-insights/insights-runtime-extractor-8mrp8" Apr 20 15:05:28.252547 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.252552 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv2rf\" (UniqueName: \"kubernetes.io/projected/f34a05dd-26aa-4559-a6d7-361a7c4e19c8-kube-api-access-nv2rf\") pod \"insights-runtime-extractor-8mrp8\" (UID: \"f34a05dd-26aa-4559-a6d7-361a7c4e19c8\") " pod="openshift-insights/insights-runtime-extractor-8mrp8" Apr 20 15:05:28.252735 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.252573 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f34a05dd-26aa-4559-a6d7-361a7c4e19c8-data-volume\") pod \"insights-runtime-extractor-8mrp8\" (UID: \"f34a05dd-26aa-4559-a6d7-361a7c4e19c8\") " pod="openshift-insights/insights-runtime-extractor-8mrp8" Apr 20 15:05:28.252735 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.252634 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f34a05dd-26aa-4559-a6d7-361a7c4e19c8-crio-socket\") pod \"insights-runtime-extractor-8mrp8\" (UID: \"f34a05dd-26aa-4559-a6d7-361a7c4e19c8\") " pod="openshift-insights/insights-runtime-extractor-8mrp8" Apr 20 15:05:28.252735 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.252657 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f34a05dd-26aa-4559-a6d7-361a7c4e19c8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8mrp8\" (UID: \"f34a05dd-26aa-4559-a6d7-361a7c4e19c8\") " pod="openshift-insights/insights-runtime-extractor-8mrp8" Apr 20 15:05:28.299601 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.299565 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-c2qbw"] Apr 20 15:05:28.302691 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.302667 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-c2qbw" Apr 20 15:05:28.304229 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.304208 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xkh5p"] Apr 20 15:05:28.306102 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.306081 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-5lf5d\"" Apr 20 15:05:28.306215 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.306098 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 15:05:28.306416 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.306398 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 15:05:28.306993 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.306975 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-b959fb7fc-fmqvl"] Apr 20 15:05:28.307145 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.307128 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xkh5p" Apr 20 15:05:28.309951 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.309933 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.310550 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.310531 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-wwq4v\"" Apr 20 15:05:28.310761 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.310562 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 20 15:05:28.312511 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.312495 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nfkdq\"" Apr 20 15:05:28.312686 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.312673 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 15:05:28.312736 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.312701 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 15:05:28.312923 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.312910 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 15:05:28.318564 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.318109 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 15:05:28.319942 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.319916 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xkh5p"] Apr 20 15:05:28.321202 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.321180 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/downloads-6bcc868b7-c2qbw"] Apr 20 15:05:28.325543 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.325524 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b959fb7fc-fmqvl"] Apr 20 15:05:28.353867 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.353840 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36c8144f-abb6-49c5-b926-7737d427688b-bound-sa-token\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.354033 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.353874 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/62be979d-08db-4830-a936-48380c484f67-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-xkh5p\" (UID: \"62be979d-08db-4830-a936-48380c484f67\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xkh5p" Apr 20 15:05:28.354033 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.353901 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f34a05dd-26aa-4559-a6d7-361a7c4e19c8-crio-socket\") pod \"insights-runtime-extractor-8mrp8\" (UID: \"f34a05dd-26aa-4559-a6d7-361a7c4e19c8\") " pod="openshift-insights/insights-runtime-extractor-8mrp8" Apr 20 15:05:28.354033 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.353923 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f34a05dd-26aa-4559-a6d7-361a7c4e19c8-data-volume\") pod \"insights-runtime-extractor-8mrp8\" (UID: \"f34a05dd-26aa-4559-a6d7-361a7c4e19c8\") " pod="openshift-insights/insights-runtime-extractor-8mrp8" Apr 20 15:05:28.354033 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.353983 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f34a05dd-26aa-4559-a6d7-361a7c4e19c8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8mrp8\" (UID: \"f34a05dd-26aa-4559-a6d7-361a7c4e19c8\") " pod="openshift-insights/insights-runtime-extractor-8mrp8" Apr 20 15:05:28.354033 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.354018 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/36c8144f-abb6-49c5-b926-7737d427688b-registry-certificates\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.354033 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.354037 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqnst\" (UniqueName: \"kubernetes.io/projected/36c8144f-abb6-49c5-b926-7737d427688b-kube-api-access-fqnst\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.354302 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.354046 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" 
(UniqueName: \"kubernetes.io/host-path/f34a05dd-26aa-4559-a6d7-361a7c4e19c8-crio-socket\") pod \"insights-runtime-extractor-8mrp8\" (UID: \"f34a05dd-26aa-4559-a6d7-361a7c4e19c8\") " pod="openshift-insights/insights-runtime-extractor-8mrp8" Apr 20 15:05:28.354302 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.354066 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6w26\" (UniqueName: \"kubernetes.io/projected/9c3cc9ef-d0f3-4ac1-b2cb-c8a54c30ac41-kube-api-access-q6w26\") pod \"downloads-6bcc868b7-c2qbw\" (UID: \"9c3cc9ef-d0f3-4ac1-b2cb-c8a54c30ac41\") " pod="openshift-console/downloads-6bcc868b7-c2qbw" Apr 20 15:05:28.354302 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.354136 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nv2rf\" (UniqueName: \"kubernetes.io/projected/f34a05dd-26aa-4559-a6d7-361a7c4e19c8-kube-api-access-nv2rf\") pod \"insights-runtime-extractor-8mrp8\" (UID: \"f34a05dd-26aa-4559-a6d7-361a7c4e19c8\") " pod="openshift-insights/insights-runtime-extractor-8mrp8" Apr 20 15:05:28.354302 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.354161 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f34a05dd-26aa-4559-a6d7-361a7c4e19c8-data-volume\") pod \"insights-runtime-extractor-8mrp8\" (UID: \"f34a05dd-26aa-4559-a6d7-361a7c4e19c8\") " pod="openshift-insights/insights-runtime-extractor-8mrp8" Apr 20 15:05:28.354302 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.354181 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f34a05dd-26aa-4559-a6d7-361a7c4e19c8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8mrp8\" (UID: \"f34a05dd-26aa-4559-a6d7-361a7c4e19c8\") " pod="openshift-insights/insights-runtime-extractor-8mrp8" Apr 20 15:05:28.354302 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.354210 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/36c8144f-abb6-49c5-b926-7737d427688b-registry-tls\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.354302 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.354237 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/36c8144f-abb6-49c5-b926-7737d427688b-installation-pull-secrets\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.354302 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.354273 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/36c8144f-abb6-49c5-b926-7737d427688b-ca-trust-extracted\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.354616 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.354371 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/36c8144f-abb6-49c5-b926-7737d427688b-trusted-ca\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.354616 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.354408 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/36c8144f-abb6-49c5-b926-7737d427688b-image-registry-private-configuration\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.354616 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.354512 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f34a05dd-26aa-4559-a6d7-361a7c4e19c8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8mrp8\" (UID: \"f34a05dd-26aa-4559-a6d7-361a7c4e19c8\") " pod="openshift-insights/insights-runtime-extractor-8mrp8" Apr 20 15:05:28.356274 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.356256 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f34a05dd-26aa-4559-a6d7-361a7c4e19c8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8mrp8\" (UID: \"f34a05dd-26aa-4559-a6d7-361a7c4e19c8\") " pod="openshift-insights/insights-runtime-extractor-8mrp8" Apr 20 15:05:28.361745 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.361719 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv2rf\" (UniqueName: \"kubernetes.io/projected/f34a05dd-26aa-4559-a6d7-361a7c4e19c8-kube-api-access-nv2rf\") pod \"insights-runtime-extractor-8mrp8\" (UID: \"f34a05dd-26aa-4559-a6d7-361a7c4e19c8\") " pod="openshift-insights/insights-runtime-extractor-8mrp8" Apr 20 15:05:28.455660 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.455556 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/36c8144f-abb6-49c5-b926-7737d427688b-installation-pull-secrets\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.455660 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.455607 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/36c8144f-abb6-49c5-b926-7737d427688b-ca-trust-extracted\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.455660 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.455661 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36c8144f-abb6-49c5-b926-7737d427688b-trusted-ca\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.455945 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.455686 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/36c8144f-abb6-49c5-b926-7737d427688b-image-registry-private-configuration\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.455945 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.455716 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36c8144f-abb6-49c5-b926-7737d427688b-bound-sa-token\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.455945 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.455741 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/62be979d-08db-4830-a936-48380c484f67-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-xkh5p\" (UID: \"62be979d-08db-4830-a936-48380c484f67\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xkh5p" Apr 20 15:05:28.455945 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.455799 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/36c8144f-abb6-49c5-b926-7737d427688b-registry-certificates\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.455945 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.455852 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqnst\" (UniqueName: \"kubernetes.io/projected/36c8144f-abb6-49c5-b926-7737d427688b-kube-api-access-fqnst\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.455945 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.455913 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6w26\" (UniqueName: \"kubernetes.io/projected/9c3cc9ef-d0f3-4ac1-b2cb-c8a54c30ac41-kube-api-access-q6w26\") pod \"downloads-6bcc868b7-c2qbw\" (UID: \"9c3cc9ef-d0f3-4ac1-b2cb-c8a54c30ac41\") " pod="openshift-console/downloads-6bcc868b7-c2qbw" Apr 20 15:05:28.456225 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.456118 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/36c8144f-abb6-49c5-b926-7737d427688b-registry-tls\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.456575 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.456546 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/36c8144f-abb6-49c5-b926-7737d427688b-ca-trust-extracted\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.456723 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.456693 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/36c8144f-abb6-49c5-b926-7737d427688b-registry-certificates\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.456856 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.456818 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36c8144f-abb6-49c5-b926-7737d427688b-trusted-ca\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.458487 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.458461 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/36c8144f-abb6-49c5-b926-7737d427688b-image-registry-private-configuration\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.458584 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.458493 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/62be979d-08db-4830-a936-48380c484f67-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-xkh5p\" (UID: \"62be979d-08db-4830-a936-48380c484f67\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xkh5p" Apr 20 15:05:28.458584 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.458525 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/36c8144f-abb6-49c5-b926-7737d427688b-installation-pull-secrets\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.458659 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.458603 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/36c8144f-abb6-49c5-b926-7737d427688b-registry-tls\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.472874 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.472840 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6w26\" (UniqueName: \"kubernetes.io/projected/9c3cc9ef-d0f3-4ac1-b2cb-c8a54c30ac41-kube-api-access-q6w26\") pod \"downloads-6bcc868b7-c2qbw\" (UID: \"9c3cc9ef-d0f3-4ac1-b2cb-c8a54c30ac41\") " pod="openshift-console/downloads-6bcc868b7-c2qbw" Apr 20 15:05:28.472986 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.472908 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36c8144f-abb6-49c5-b926-7737d427688b-bound-sa-token\") pod \"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.472986 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.472922 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqnst\" (UniqueName: \"kubernetes.io/projected/36c8144f-abb6-49c5-b926-7737d427688b-kube-api-access-fqnst\") pod 
\"image-registry-b959fb7fc-fmqvl\" (UID: \"36c8144f-abb6-49c5-b926-7737d427688b\") " pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.532903 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.532854 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8mrp8" Apr 20 15:05:28.614309 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.614255 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-c2qbw" Apr 20 15:05:28.622358 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.622328 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xkh5p" Apr 20 15:05:28.628183 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.628154 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:28.656605 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.656545 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8mrp8"] Apr 20 15:05:28.670141 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.670063 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8mrp8" event={"ID":"f34a05dd-26aa-4559-a6d7-361a7c4e19c8","Type":"ContainerStarted","Data":"975eb68bec9e78d231e287119000a57fd900f96d319a6376c774d21fc87f3392"} Apr 20 15:05:28.787961 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.787917 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-c2qbw"] Apr 20 15:05:28.791114 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:05:28.791081 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c3cc9ef_d0f3_4ac1_b2cb_c8a54c30ac41.slice/crio-57ebe890b6249adb05325e3e5a7824ec3616c7c992e051022f2b807ff92d815e WatchSource:0}: Error finding container 57ebe890b6249adb05325e3e5a7824ec3616c7c992e051022f2b807ff92d815e: Status 404 returned error can't find the container with id 57ebe890b6249adb05325e3e5a7824ec3616c7c992e051022f2b807ff92d815e Apr 20 15:05:28.798705 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.798650 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xkh5p"] Apr 20 15:05:28.802496 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:05:28.802472 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62be979d_08db_4830_a936_48380c484f67.slice/crio-d4106164e70b21823c10e5e339327ec6275a332c8e02aa13fb274b93205629bd WatchSource:0}: Error finding container d4106164e70b21823c10e5e339327ec6275a332c8e02aa13fb274b93205629bd: Status 404 returned error can't find the container with id d4106164e70b21823c10e5e339327ec6275a332c8e02aa13fb274b93205629bd Apr 20 15:05:28.817082 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:28.817053 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b959fb7fc-fmqvl"] Apr 20 15:05:28.820083 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:05:28.820059 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36c8144f_abb6_49c5_b926_7737d427688b.slice/crio-26a180117211e28c9ea542bd7f990be83d05cbca220a1d52f0cb3bd1ec038d7d WatchSource:0}: Error finding container 26a180117211e28c9ea542bd7f990be83d05cbca220a1d52f0cb3bd1ec038d7d: Status 404 returned error can't find the container with id 26a180117211e28c9ea542bd7f990be83d05cbca220a1d52f0cb3bd1ec038d7d Apr 20 15:05:29.674592 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:29.674550 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-c2qbw" event={"ID":"9c3cc9ef-d0f3-4ac1-b2cb-c8a54c30ac41","Type":"ContainerStarted","Data":"57ebe890b6249adb05325e3e5a7824ec3616c7c992e051022f2b807ff92d815e"} Apr 20 15:05:29.677149 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:29.677112 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8mrp8" event={"ID":"f34a05dd-26aa-4559-a6d7-361a7c4e19c8","Type":"ContainerStarted","Data":"be6b8d761d73b688bf3b8847818bc3432507be23d00bea64c7ba2c64eba4dbee"} Apr 20 15:05:29.677318 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:29.677153 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8mrp8" event={"ID":"f34a05dd-26aa-4559-a6d7-361a7c4e19c8","Type":"ContainerStarted","Data":"5ed2e3fffb8199efda11a8b740b9e8b14df43012f5aacac618d82e40a4bac3db"} Apr 20 15:05:29.679229 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:29.679196 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" event={"ID":"36c8144f-abb6-49c5-b926-7737d427688b","Type":"ContainerStarted","Data":"27dcf47d70cf2ba5f3fabf982c3b9113c224f606394619aad246f890c072b325"} Apr 20 15:05:29.679375 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:29.679236 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" event={"ID":"36c8144f-abb6-49c5-b926-7737d427688b","Type":"ContainerStarted","Data":"26a180117211e28c9ea542bd7f990be83d05cbca220a1d52f0cb3bd1ec038d7d"} Apr 20 15:05:29.679375 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:29.679274 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:05:29.680565 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:29.680526 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xkh5p" event={"ID":"62be979d-08db-4830-a936-48380c484f67","Type":"ContainerStarted","Data":"d4106164e70b21823c10e5e339327ec6275a332c8e02aa13fb274b93205629bd"} Apr 20 15:05:29.696331 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:29.696265 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" podStartSLOduration=1.696246395 podStartE2EDuration="1.696246395s" podCreationTimestamp="2026-04-20 15:05:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:05:29.695546517 +0000 UTC m=+180.131090674" watchObservedRunningTime="2026-04-20 15:05:29.696246395 +0000 UTC m=+180.131790550" Apr 20 15:05:30.685383 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:30.685339 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xkh5p" event={"ID":"62be979d-08db-4830-a936-48380c484f67","Type":"ContainerStarted","Data":"b177d84eeafb12424906c5ebaa2efbe54f0cad56a35e76098374e3960cada62e"} Apr 20 15:05:30.700090 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:30.700035 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xkh5p" podStartSLOduration=1.6066802679999999 podStartE2EDuration="2.700017733s" podCreationTimestamp="2026-04-20 15:05:28 +0000 UTC" firstStartedPulling="2026-04-20 15:05:28.804654965 +0000 UTC m=+179.240199100" lastFinishedPulling="2026-04-20 15:05:29.897992421 +0000 UTC m=+180.333536565" observedRunningTime="2026-04-20 15:05:30.69909483 +0000 UTC m=+181.134638987" watchObservedRunningTime="2026-04-20 15:05:30.700017733 +0000 UTC m=+181.135561887" Apr 20 15:05:31.688590 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:31.688561 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xkh5p" Apr 20 15:05:31.693436 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:31.693411 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xkh5p" Apr 20 15:05:32.152221 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.152178 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-fdghk"] Apr 20 15:05:32.155483 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.155451 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-fdghk" Apr 20 15:05:32.158767 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.158561 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 20 15:05:32.158767 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.158579 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 20 15:05:32.158767 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.158591 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-g6wk8\"" Apr 20 15:05:32.158767 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.158612 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 15:05:32.158767 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.158565 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 15:05:32.158767 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.158760 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 15:05:32.164312 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.164276 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-fdghk"] Apr 20 15:05:32.185704 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.185666 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/c85d567e-27dc-4cfa-bcf6-f4b3181c4e62-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-fdghk\" (UID: \"c85d567e-27dc-4cfa-bcf6-f4b3181c4e62\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fdghk" Apr 20 15:05:32.185958 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.185780 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c85d567e-27dc-4cfa-bcf6-f4b3181c4e62-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-fdghk\" (UID: \"c85d567e-27dc-4cfa-bcf6-f4b3181c4e62\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fdghk" Apr 20 15:05:32.185958 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.185812 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87tlt\" (UniqueName: \"kubernetes.io/projected/c85d567e-27dc-4cfa-bcf6-f4b3181c4e62-kube-api-access-87tlt\") pod \"prometheus-operator-5676c8c784-fdghk\" (UID: \"c85d567e-27dc-4cfa-bcf6-f4b3181c4e62\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fdghk" Apr 20 15:05:32.185958 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.185861 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c85d567e-27dc-4cfa-bcf6-f4b3181c4e62-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-fdghk\" (UID: \"c85d567e-27dc-4cfa-bcf6-f4b3181c4e62\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fdghk" Apr 20 15:05:32.287306 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.287248 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c85d567e-27dc-4cfa-bcf6-f4b3181c4e62-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-fdghk\" (UID: \"c85d567e-27dc-4cfa-bcf6-f4b3181c4e62\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fdghk" Apr 20 15:05:32.287509 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.287334 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87tlt\" (UniqueName: \"kubernetes.io/projected/c85d567e-27dc-4cfa-bcf6-f4b3181c4e62-kube-api-access-87tlt\") pod \"prometheus-operator-5676c8c784-fdghk\" (UID: \"c85d567e-27dc-4cfa-bcf6-f4b3181c4e62\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fdghk" Apr 20 15:05:32.287509 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.287396 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c85d567e-27dc-4cfa-bcf6-f4b3181c4e62-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-fdghk\" (UID: \"c85d567e-27dc-4cfa-bcf6-f4b3181c4e62\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fdghk" Apr 20 15:05:32.287509 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.287435 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c85d567e-27dc-4cfa-bcf6-f4b3181c4e62-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-fdghk\" (UID: \"c85d567e-27dc-4cfa-bcf6-f4b3181c4e62\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fdghk" Apr 20 15:05:32.287684 
ip-10-0-129-115 kubenswrapper[2577]: E0420 15:05:32.287556 2577 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 20 15:05:32.287684 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:05:32.287617 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c85d567e-27dc-4cfa-bcf6-f4b3181c4e62-prometheus-operator-tls podName:c85d567e-27dc-4cfa-bcf6-f4b3181c4e62 nodeName:}" failed. No retries permitted until 2026-04-20 15:05:32.787597369 +0000 UTC m=+183.223141515 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/c85d567e-27dc-4cfa-bcf6-f4b3181c4e62-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-fdghk" (UID: "c85d567e-27dc-4cfa-bcf6-f4b3181c4e62") : secret "prometheus-operator-tls" not found Apr 20 15:05:32.288310 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.288255 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c85d567e-27dc-4cfa-bcf6-f4b3181c4e62-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-fdghk\" (UID: \"c85d567e-27dc-4cfa-bcf6-f4b3181c4e62\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fdghk" Apr 20 15:05:32.290014 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.289985 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c85d567e-27dc-4cfa-bcf6-f4b3181c4e62-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-fdghk\" (UID: \"c85d567e-27dc-4cfa-bcf6-f4b3181c4e62\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fdghk" Apr 20 15:05:32.296361 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.296335 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87tlt\" (UniqueName: \"kubernetes.io/projected/c85d567e-27dc-4cfa-bcf6-f4b3181c4e62-kube-api-access-87tlt\") pod \"prometheus-operator-5676c8c784-fdghk\" (UID: \"c85d567e-27dc-4cfa-bcf6-f4b3181c4e62\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fdghk" Apr 20 15:05:32.694230 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.694194 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8mrp8" event={"ID":"f34a05dd-26aa-4559-a6d7-361a7c4e19c8","Type":"ContainerStarted","Data":"a5c37778321958dd7ccefa47d8fda8dfd08283f497ac5e85dddd6c0bf187038d"} Apr 20 15:05:32.713226 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.713160 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8mrp8" podStartSLOduration=1.771811888 podStartE2EDuration="4.713139926s" podCreationTimestamp="2026-04-20 15:05:28 +0000 UTC" firstStartedPulling="2026-04-20 15:05:28.762154329 +0000 UTC m=+179.197698464" lastFinishedPulling="2026-04-20 15:05:31.703482371 +0000 UTC m=+182.139026502" observedRunningTime="2026-04-20 15:05:32.712967324 +0000 UTC m=+183.148511696" watchObservedRunningTime="2026-04-20 15:05:32.713139926 +0000 UTC m=+183.148684081" Apr 20 15:05:32.791839 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:32.791789 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c85d567e-27dc-4cfa-bcf6-f4b3181c4e62-prometheus-operator-tls\") pod 
\"prometheus-operator-5676c8c784-fdghk\" (UID: \"c85d567e-27dc-4cfa-bcf6-f4b3181c4e62\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fdghk" Apr 20 15:05:32.792018 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:05:32.791867 2577 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 20 15:05:32.792018 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:05:32.791939 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c85d567e-27dc-4cfa-bcf6-f4b3181c4e62-prometheus-operator-tls podName:c85d567e-27dc-4cfa-bcf6-f4b3181c4e62 nodeName:}" failed. No retries permitted until 2026-04-20 15:05:33.791918111 +0000 UTC m=+184.227462260 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/c85d567e-27dc-4cfa-bcf6-f4b3181c4e62-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-fdghk" (UID: "c85d567e-27dc-4cfa-bcf6-f4b3181c4e62") : secret "prometheus-operator-tls" not found Apr 20 15:05:33.801221 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:33.801171 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c85d567e-27dc-4cfa-bcf6-f4b3181c4e62-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-fdghk\" (UID: \"c85d567e-27dc-4cfa-bcf6-f4b3181c4e62\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fdghk" Apr 20 15:05:33.804013 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:33.803985 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c85d567e-27dc-4cfa-bcf6-f4b3181c4e62-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-fdghk\" (UID: \"c85d567e-27dc-4cfa-bcf6-f4b3181c4e62\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fdghk" Apr 20 15:05:33.967050 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:33.967013 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-fdghk" Apr 20 15:05:34.098785 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:34.098747 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-fdghk"] Apr 20 15:05:34.102027 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:05:34.101993 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc85d567e_27dc_4cfa_bcf6_f4b3181c4e62.slice/crio-d7ad3b00ecaa48124e8242dc132f52717b9e9334ad8ccbcad56c7d550045fdf9 WatchSource:0}: Error finding container d7ad3b00ecaa48124e8242dc132f52717b9e9334ad8ccbcad56c7d550045fdf9: Status 404 returned error can't find the container with id d7ad3b00ecaa48124e8242dc132f52717b9e9334ad8ccbcad56c7d550045fdf9 Apr 20 15:05:34.701263 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:34.701223 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-fdghk" event={"ID":"c85d567e-27dc-4cfa-bcf6-f4b3181c4e62","Type":"ContainerStarted","Data":"d7ad3b00ecaa48124e8242dc132f52717b9e9334ad8ccbcad56c7d550045fdf9"} Apr 20 15:05:35.707189 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:35.707089 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-fdghk" event={"ID":"c85d567e-27dc-4cfa-bcf6-f4b3181c4e62","Type":"ContainerStarted","Data":"cf21da530702bdce1d5482649e664b8ab4637adbe1d1fce825afdb064a4cd286"} Apr 20 15:05:35.707189 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:35.707137 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-fdghk" event={"ID":"c85d567e-27dc-4cfa-bcf6-f4b3181c4e62","Type":"ContainerStarted","Data":"d95f64770a755212e88f3d62a2260b457fd2741b0299463be9d73cb9db9db81d"} Apr 20 15:05:35.726219 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:35.726144 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-fdghk" podStartSLOduration=2.2907865 podStartE2EDuration="3.726122832s" podCreationTimestamp="2026-04-20 15:05:32 +0000 UTC" firstStartedPulling="2026-04-20 15:05:34.104212021 +0000 UTC m=+184.539756153" lastFinishedPulling="2026-04-20 15:05:35.539548353 +0000 UTC m=+185.975092485" observedRunningTime="2026-04-20 15:05:35.725138566 +0000 UTC m=+186.160682721" watchObservedRunningTime="2026-04-20 15:05:35.726122832 +0000 UTC m=+186.161666990" Apr 20 15:05:37.514073 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.514034 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-bhvzv"] Apr 20 15:05:37.517769 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.517745 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.520369 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.520344 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-lk8g4\"" Apr 20 15:05:37.520540 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.520515 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 15:05:37.520656 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.520512 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 15:05:37.520656 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.520567 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 15:05:37.634669 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.634635 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/65bc4e5e-758a-43c0-abbc-de79866d22b8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.634933 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.634716 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/65bc4e5e-758a-43c0-abbc-de79866d22b8-node-exporter-tls\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.634933 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.634770 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/65bc4e5e-758a-43c0-abbc-de79866d22b8-node-exporter-textfile\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.634933 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.634798 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcrvm\" (UniqueName: \"kubernetes.io/projected/65bc4e5e-758a-43c0-abbc-de79866d22b8-kube-api-access-kcrvm\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.634933 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.634870 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/65bc4e5e-758a-43c0-abbc-de79866d22b8-node-exporter-wtmp\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.634933 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.634935 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/65bc4e5e-758a-43c0-abbc-de79866d22b8-sys\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 
15:05:37.635157 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.634957 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/65bc4e5e-758a-43c0-abbc-de79866d22b8-root\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.635157 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.634977 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/65bc4e5e-758a-43c0-abbc-de79866d22b8-node-exporter-accelerators-collector-config\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.635157 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.635029 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/65bc4e5e-758a-43c0-abbc-de79866d22b8-metrics-client-ca\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.736081 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.736041 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/65bc4e5e-758a-43c0-abbc-de79866d22b8-sys\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.736277 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.736096 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/65bc4e5e-758a-43c0-abbc-de79866d22b8-root\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.736277 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.736120 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/65bc4e5e-758a-43c0-abbc-de79866d22b8-node-exporter-accelerators-collector-config\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.736277 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.736140 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/65bc4e5e-758a-43c0-abbc-de79866d22b8-sys\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.736277 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.736147 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/65bc4e5e-758a-43c0-abbc-de79866d22b8-metrics-client-ca\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.736277 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.736218 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/65bc4e5e-758a-43c0-abbc-de79866d22b8-root\") pod 
\"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.736277 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.736273 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/65bc4e5e-758a-43c0-abbc-de79866d22b8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.736628 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.736349 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/65bc4e5e-758a-43c0-abbc-de79866d22b8-node-exporter-tls\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.736628 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.736395 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/65bc4e5e-758a-43c0-abbc-de79866d22b8-node-exporter-textfile\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.736628 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.736422 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcrvm\" (UniqueName: \"kubernetes.io/projected/65bc4e5e-758a-43c0-abbc-de79866d22b8-kube-api-access-kcrvm\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.736628 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.736449 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/65bc4e5e-758a-43c0-abbc-de79866d22b8-node-exporter-wtmp\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.736628 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.736626 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/65bc4e5e-758a-43c0-abbc-de79866d22b8-node-exporter-wtmp\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.736873 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.736837 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/65bc4e5e-758a-43c0-abbc-de79866d22b8-node-exporter-accelerators-collector-config\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.737108 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.737060 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/65bc4e5e-758a-43c0-abbc-de79866d22b8-metrics-client-ca\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.737108 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.737080 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/65bc4e5e-758a-43c0-abbc-de79866d22b8-node-exporter-textfile\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.739114 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.739091 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/65bc4e5e-758a-43c0-abbc-de79866d22b8-node-exporter-tls\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.739216 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.739149 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/65bc4e5e-758a-43c0-abbc-de79866d22b8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.745211 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.745182 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcrvm\" (UniqueName: \"kubernetes.io/projected/65bc4e5e-758a-43c0-abbc-de79866d22b8-kube-api-access-kcrvm\") pod \"node-exporter-bhvzv\" (UID: \"65bc4e5e-758a-43c0-abbc-de79866d22b8\") " pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.828926 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:37.828490 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bhvzv" Apr 20 15:05:37.839713 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:05:37.839676 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65bc4e5e_758a_43c0_abbc_de79866d22b8.slice/crio-7c9b90fd146f967a67ed54841f4d59bfc1a749a48efa70ea1f5c39ca4cf00ba3 WatchSource:0}: Error finding container 7c9b90fd146f967a67ed54841f4d59bfc1a749a48efa70ea1f5c39ca4cf00ba3: Status 404 returned error can't find the container with id 7c9b90fd146f967a67ed54841f4d59bfc1a749a48efa70ea1f5c39ca4cf00ba3 Apr 20 15:05:38.717037 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:38.716984 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bhvzv" event={"ID":"65bc4e5e-758a-43c0-abbc-de79866d22b8","Type":"ContainerStarted","Data":"7c9b90fd146f967a67ed54841f4d59bfc1a749a48efa70ea1f5c39ca4cf00ba3"} Apr 20 15:05:41.895628 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:41.895585 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-f684b8c45-5dpm6"] Apr 20 15:05:41.898979 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:41.898954 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:41.902393 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:41.902125 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 20 15:05:41.902393 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:41.902212 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-52b1kqmdi2m4f\"" Apr 20 15:05:41.902393 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:41.902239 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 20 15:05:41.902393 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:41.902272 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 20 15:05:41.902393 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:41.902272 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-28r72\"" Apr 20 15:05:41.902393 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:41.902215 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 15:05:41.907261 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:41.907238 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-f684b8c45-5dpm6"] Apr 20 15:05:41.971376 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:41.971333 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58db9f7d-153a-4038-be2f-db349004a778-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-f684b8c45-5dpm6\" (UID: \"58db9f7d-153a-4038-be2f-db349004a778\") " pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:41.971376 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:41.971379 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szt4s\" (UniqueName: \"kubernetes.io/projected/58db9f7d-153a-4038-be2f-db349004a778-kube-api-access-szt4s\") pod \"metrics-server-f684b8c45-5dpm6\" (UID: \"58db9f7d-153a-4038-be2f-db349004a778\") " pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:41.971612 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:41.971418 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/58db9f7d-153a-4038-be2f-db349004a778-metrics-server-audit-profiles\") pod \"metrics-server-f684b8c45-5dpm6\" (UID: \"58db9f7d-153a-4038-be2f-db349004a778\") " pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:41.971612 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:41.971464 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/58db9f7d-153a-4038-be2f-db349004a778-secret-metrics-server-tls\") pod \"metrics-server-f684b8c45-5dpm6\" (UID: \"58db9f7d-153a-4038-be2f-db349004a778\") " pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:41.971612 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:41.971508 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/58db9f7d-153a-4038-be2f-db349004a778-secret-metrics-server-client-certs\") pod \"metrics-server-f684b8c45-5dpm6\" (UID: \"58db9f7d-153a-4038-be2f-db349004a778\") " pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:41.971612 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:41.971597 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/58db9f7d-153a-4038-be2f-db349004a778-audit-log\") pod \"metrics-server-f684b8c45-5dpm6\" (UID: \"58db9f7d-153a-4038-be2f-db349004a778\") " pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:41.971770 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:41.971619 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58db9f7d-153a-4038-be2f-db349004a778-client-ca-bundle\") pod \"metrics-server-f684b8c45-5dpm6\" (UID: \"58db9f7d-153a-4038-be2f-db349004a778\") " pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:42.072612 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.072572 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/58db9f7d-153a-4038-be2f-db349004a778-audit-log\") pod \"metrics-server-f684b8c45-5dpm6\" (UID: \"58db9f7d-153a-4038-be2f-db349004a778\") " pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:42.072612 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.072613 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58db9f7d-153a-4038-be2f-db349004a778-client-ca-bundle\") pod \"metrics-server-f684b8c45-5dpm6\" (UID: \"58db9f7d-153a-4038-be2f-db349004a778\") " pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:42.072871 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.072755 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58db9f7d-153a-4038-be2f-db349004a778-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-f684b8c45-5dpm6\" (UID: \"58db9f7d-153a-4038-be2f-db349004a778\") " pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:42.072871 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.072784 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szt4s\" (UniqueName: \"kubernetes.io/projected/58db9f7d-153a-4038-be2f-db349004a778-kube-api-access-szt4s\") pod \"metrics-server-f684b8c45-5dpm6\" (UID: \"58db9f7d-153a-4038-be2f-db349004a778\") " pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:42.072871 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.072844 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/58db9f7d-153a-4038-be2f-db349004a778-metrics-server-audit-profiles\") pod \"metrics-server-f684b8c45-5dpm6\" (UID: \"58db9f7d-153a-4038-be2f-db349004a778\") " pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:42.073008 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.072892 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/58db9f7d-153a-4038-be2f-db349004a778-secret-metrics-server-tls\") pod \"metrics-server-f684b8c45-5dpm6\" (UID: \"58db9f7d-153a-4038-be2f-db349004a778\") " pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:42.073008 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.072924 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/58db9f7d-153a-4038-be2f-db349004a778-secret-metrics-server-client-certs\") pod \"metrics-server-f684b8c45-5dpm6\" (UID: \"58db9f7d-153a-4038-be2f-db349004a778\") " pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:42.073111 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.073059 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/58db9f7d-153a-4038-be2f-db349004a778-audit-log\") pod \"metrics-server-f684b8c45-5dpm6\" (UID: \"58db9f7d-153a-4038-be2f-db349004a778\") " pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:42.073814 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.073788 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58db9f7d-153a-4038-be2f-db349004a778-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-f684b8c45-5dpm6\" (UID: \"58db9f7d-153a-4038-be2f-db349004a778\") " pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:42.074138 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.074089 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/58db9f7d-153a-4038-be2f-db349004a778-metrics-server-audit-profiles\") pod \"metrics-server-f684b8c45-5dpm6\" (UID: \"58db9f7d-153a-4038-be2f-db349004a778\") " pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:42.075947 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.075903 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/58db9f7d-153a-4038-be2f-db349004a778-secret-metrics-server-client-certs\") pod \"metrics-server-f684b8c45-5dpm6\" (UID: \"58db9f7d-153a-4038-be2f-db349004a778\") " pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:42.076238 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.076215 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58db9f7d-153a-4038-be2f-db349004a778-client-ca-bundle\") pod \"metrics-server-f684b8c45-5dpm6\" (UID: \"58db9f7d-153a-4038-be2f-db349004a778\") " pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:42.076238 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.076228 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/58db9f7d-153a-4038-be2f-db349004a778-secret-metrics-server-tls\") pod \"metrics-server-f684b8c45-5dpm6\" (UID: \"58db9f7d-153a-4038-be2f-db349004a778\") " pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:42.081225 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.081203 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-szt4s\" (UniqueName: \"kubernetes.io/projected/58db9f7d-153a-4038-be2f-db349004a778-kube-api-access-szt4s\") pod \"metrics-server-f684b8c45-5dpm6\" (UID: \"58db9f7d-153a-4038-be2f-db349004a778\") " pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:42.213779 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.213690 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:05:42.282025 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.281979 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-f2vbx"] Apr 20 15:05:42.285507 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.285483 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-f2vbx" Apr 20 15:05:42.287747 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.287715 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 20 15:05:42.287881 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.287817 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-zjfzr\"" Apr 20 15:05:42.292861 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.292812 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-f2vbx"] Apr 20 15:05:42.375301 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.375250 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/96aa1153-b6d1-41da-b998-8cb9a9e6f56a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-f2vbx\" (UID: \"96aa1153-b6d1-41da-b998-8cb9a9e6f56a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-f2vbx" Apr 20 15:05:42.475946 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.475906 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/96aa1153-b6d1-41da-b998-8cb9a9e6f56a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-f2vbx\" (UID: \"96aa1153-b6d1-41da-b998-8cb9a9e6f56a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-f2vbx" Apr 20 15:05:42.478817 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.478784 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/96aa1153-b6d1-41da-b998-8cb9a9e6f56a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-f2vbx\" (UID: \"96aa1153-b6d1-41da-b998-8cb9a9e6f56a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-f2vbx" Apr 20 15:05:42.598548 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.598506 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-f2vbx" Apr 20 15:05:42.929580 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.929496 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-77c4645b68-cvtsr"] Apr 20 15:05:42.932975 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.932952 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:42.935305 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.935264 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 15:05:42.936109 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.936072 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 15:05:42.936301 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.936164 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 15:05:42.936301 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.936229 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 15:05:42.936451 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.936172 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 15:05:42.936451 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.936333 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-6cr2p\"" Apr 20 15:05:42.940992 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.940945 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 20 15:05:42.942659 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.942353 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77c4645b68-cvtsr"] Apr 20 15:05:42.981387 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.981345 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-service-ca\") pod \"console-77c4645b68-cvtsr\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:42.981566 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.981406 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-console-config\") pod \"console-77c4645b68-cvtsr\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:42.981566 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.981427 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-trusted-ca-bundle\") pod \"console-77c4645b68-cvtsr\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:42.981566 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.981448 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43428ca4-798e-4aea-98e6-809f01256074-console-serving-cert\") pod \"console-77c4645b68-cvtsr\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:42.981566 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.981478 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43428ca4-798e-4aea-98e6-809f01256074-console-oauth-config\") pod \"console-77c4645b68-cvtsr\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:42.981566 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.981565 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-oauth-serving-cert\") pod \"console-77c4645b68-cvtsr\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:42.981771 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:42.981591 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4wg2\" (UniqueName: \"kubernetes.io/projected/43428ca4-798e-4aea-98e6-809f01256074-kube-api-access-z4wg2\") pod \"console-77c4645b68-cvtsr\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:43.082269 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:43.082227 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43428ca4-798e-4aea-98e6-809f01256074-console-oauth-config\") pod \"console-77c4645b68-cvtsr\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:43.082478 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:43.082324 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-oauth-serving-cert\") pod \"console-77c4645b68-cvtsr\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:43.082478 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:43.082356 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4wg2\" (UniqueName: \"kubernetes.io/projected/43428ca4-798e-4aea-98e6-809f01256074-kube-api-access-z4wg2\") pod \"console-77c4645b68-cvtsr\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:43.082478 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:43.082429 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-service-ca\") pod \"console-77c4645b68-cvtsr\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:43.082645 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:43.082546 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-console-config\") pod \"console-77c4645b68-cvtsr\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:43.082645 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:43.082588 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-trusted-ca-bundle\") pod 
\"console-77c4645b68-cvtsr\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:43.082645 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:43.082622 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43428ca4-798e-4aea-98e6-809f01256074-console-serving-cert\") pod \"console-77c4645b68-cvtsr\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:43.083458 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:43.083404 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-oauth-serving-cert\") pod \"console-77c4645b68-cvtsr\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:43.083593 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:43.083460 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-service-ca\") pod \"console-77c4645b68-cvtsr\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:43.083593 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:43.083517 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-console-config\") pod \"console-77c4645b68-cvtsr\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:43.083593 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:43.083573 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-trusted-ca-bundle\") pod \"console-77c4645b68-cvtsr\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:43.086565 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:43.086541 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43428ca4-798e-4aea-98e6-809f01256074-console-oauth-config\") pod \"console-77c4645b68-cvtsr\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:43.086673 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:43.086544 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43428ca4-798e-4aea-98e6-809f01256074-console-serving-cert\") pod \"console-77c4645b68-cvtsr\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:43.090813 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:43.090792 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4wg2\" (UniqueName: \"kubernetes.io/projected/43428ca4-798e-4aea-98e6-809f01256074-kube-api-access-z4wg2\") pod \"console-77c4645b68-cvtsr\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:43.245213 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:43.245175 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:45.717198 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:45.717163 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-f684b8c45-5dpm6"] Apr 20 15:05:45.721272 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:05:45.721246 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58db9f7d_153a_4038_be2f_db349004a778.slice/crio-52cc4006764914f70b47ec9c622c9e70154e04ccb477161c06f57373f25deab9 WatchSource:0}: Error finding container 52cc4006764914f70b47ec9c622c9e70154e04ccb477161c06f57373f25deab9: Status 404 returned error can't find the container with id 52cc4006764914f70b47ec9c622c9e70154e04ccb477161c06f57373f25deab9 Apr 20 15:05:45.739313 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:45.739254 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bhvzv" event={"ID":"65bc4e5e-758a-43c0-abbc-de79866d22b8","Type":"ContainerStarted","Data":"5452cf40c97c7bdb5da951c77c2bc459b2bd155e0b4d1a5daebc465b04482b14"} Apr 20 15:05:45.740898 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:45.740868 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" event={"ID":"58db9f7d-153a-4038-be2f-db349004a778","Type":"ContainerStarted","Data":"52cc4006764914f70b47ec9c622c9e70154e04ccb477161c06f57373f25deab9"} Apr 20 15:05:45.742243 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:45.742214 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-c2qbw" event={"ID":"9c3cc9ef-d0f3-4ac1-b2cb-c8a54c30ac41","Type":"ContainerStarted","Data":"61d9eccf0ba632c71c05467dd06b4a878f3e6733f665d3352e9f5ae385641915"} Apr 20 15:05:45.742453 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:45.742434 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-c2qbw" Apr 20 15:05:45.743680 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:45.743647 2577 patch_prober.go:28] interesting pod/downloads-6bcc868b7-c2qbw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.134.0.14:8080/\": dial tcp 10.134.0.14:8080: connect: connection refused" start-of-body= Apr 20 15:05:45.743784 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:45.743710 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-c2qbw" podUID="9c3cc9ef-d0f3-4ac1-b2cb-c8a54c30ac41" containerName="download-server" probeResult="failure" output="Get \"http://10.134.0.14:8080/\": dial tcp 10.134.0.14:8080: connect: connection refused" Apr 20 15:05:45.771060 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:45.770993 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-c2qbw" podStartSLOduration=0.919843424 podStartE2EDuration="17.77097091s" podCreationTimestamp="2026-04-20 15:05:28 +0000 UTC" firstStartedPulling="2026-04-20 15:05:28.793459666 +0000 UTC m=+179.229003799" lastFinishedPulling="2026-04-20 15:05:45.644587141 +0000 UTC m=+196.080131285" observedRunningTime="2026-04-20 15:05:45.769341701 +0000 UTC m=+196.204885860" watchObservedRunningTime="2026-04-20 15:05:45.77097091 +0000 UTC m=+196.206515065" Apr 20 15:05:45.935113 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:45.934913 2577 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-f2vbx"] Apr 20 15:05:45.937405 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:45.937377 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77c4645b68-cvtsr"] Apr 20 15:05:45.938558 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:05:45.938484 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96aa1153_b6d1_41da_b998_8cb9a9e6f56a.slice/crio-66fc0d885c439b74e7a9f6faac633a23662f6cd0079d7bbdf9ce3e0e8ce8b42f WatchSource:0}: Error finding container 66fc0d885c439b74e7a9f6faac633a23662f6cd0079d7bbdf9ce3e0e8ce8b42f: Status 404 returned error can't find the container with id 66fc0d885c439b74e7a9f6faac633a23662f6cd0079d7bbdf9ce3e0e8ce8b42f Apr 20 15:05:45.940963 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:05:45.940935 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43428ca4_798e_4aea_98e6_809f01256074.slice/crio-e8bf10e5691eff146102c122d164f05029abb3ad58303bb05c3d895c023e072b WatchSource:0}: Error finding container e8bf10e5691eff146102c122d164f05029abb3ad58303bb05c3d895c023e072b: Status 404 returned error can't find the container with id e8bf10e5691eff146102c122d164f05029abb3ad58303bb05c3d895c023e072b Apr 20 15:05:46.344210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.344174 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-86cdf84568-mc2rg"] Apr 20 15:05:46.348375 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.348009 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.357052 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.357023 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86cdf84568-mc2rg"] Apr 20 15:05:46.419864 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.419592 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-console-config\") pod \"console-86cdf84568-mc2rg\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.420598 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.420116 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-trusted-ca-bundle\") pod \"console-86cdf84568-mc2rg\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.420598 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.420166 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/17be87f1-b077-409b-803a-3dfe473a0aa7-console-serving-cert\") pod \"console-86cdf84568-mc2rg\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.420598 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.420194 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-oauth-serving-cert\") pod \"console-86cdf84568-mc2rg\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.421353 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.421053 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/17be87f1-b077-409b-803a-3dfe473a0aa7-console-oauth-config\") pod \"console-86cdf84568-mc2rg\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.421353 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.421125 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2pxh\" (UniqueName: \"kubernetes.io/projected/17be87f1-b077-409b-803a-3dfe473a0aa7-kube-api-access-k2pxh\") pod \"console-86cdf84568-mc2rg\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.421353 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.421188 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-service-ca\") pod \"console-86cdf84568-mc2rg\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.522543 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.522443 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-service-ca\") pod \"console-86cdf84568-mc2rg\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.522543 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.522529 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-console-config\") pod \"console-86cdf84568-mc2rg\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.522543 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.522557 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-trusted-ca-bundle\") pod \"console-86cdf84568-mc2rg\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.522915 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.522588 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/17be87f1-b077-409b-803a-3dfe473a0aa7-console-serving-cert\") pod \"console-86cdf84568-mc2rg\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.522915 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.522616 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-oauth-serving-cert\") pod \"console-86cdf84568-mc2rg\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " 
pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.523332 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.523271 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-service-ca\") pod \"console-86cdf84568-mc2rg\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.523532 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.523510 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/17be87f1-b077-409b-803a-3dfe473a0aa7-console-oauth-config\") pod \"console-86cdf84568-mc2rg\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.523614 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.523562 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-console-config\") pod \"console-86cdf84568-mc2rg\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.523614 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.523571 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2pxh\" (UniqueName: \"kubernetes.io/projected/17be87f1-b077-409b-803a-3dfe473a0aa7-kube-api-access-k2pxh\") pod \"console-86cdf84568-mc2rg\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.523936 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.523511 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-trusted-ca-bundle\") pod \"console-86cdf84568-mc2rg\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.524242 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.524079 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-oauth-serving-cert\") pod \"console-86cdf84568-mc2rg\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.527553 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.527507 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/17be87f1-b077-409b-803a-3dfe473a0aa7-console-oauth-config\") pod \"console-86cdf84568-mc2rg\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.529519 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.529373 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/17be87f1-b077-409b-803a-3dfe473a0aa7-console-serving-cert\") pod \"console-86cdf84568-mc2rg\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.532625 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.532580 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2pxh\" (UniqueName: 
\"kubernetes.io/projected/17be87f1-b077-409b-803a-3dfe473a0aa7-kube-api-access-k2pxh\") pod \"console-86cdf84568-mc2rg\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.665436 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.664886 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:46.760171 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.760074 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77c4645b68-cvtsr" event={"ID":"43428ca4-798e-4aea-98e6-809f01256074","Type":"ContainerStarted","Data":"e8bf10e5691eff146102c122d164f05029abb3ad58303bb05c3d895c023e072b"} Apr 20 15:05:46.763442 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.762468 2577 generic.go:358] "Generic (PLEG): container finished" podID="65bc4e5e-758a-43c0-abbc-de79866d22b8" containerID="5452cf40c97c7bdb5da951c77c2bc459b2bd155e0b4d1a5daebc465b04482b14" exitCode=0 Apr 20 15:05:46.763442 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.762540 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bhvzv" event={"ID":"65bc4e5e-758a-43c0-abbc-de79866d22b8","Type":"ContainerDied","Data":"5452cf40c97c7bdb5da951c77c2bc459b2bd155e0b4d1a5daebc465b04482b14"} Apr 20 15:05:46.768087 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.766967 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-f2vbx" event={"ID":"96aa1153-b6d1-41da-b998-8cb9a9e6f56a","Type":"ContainerStarted","Data":"66fc0d885c439b74e7a9f6faac633a23662f6cd0079d7bbdf9ce3e0e8ce8b42f"} Apr 20 15:05:46.778234 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.778179 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-c2qbw" Apr 20 15:05:46.879981 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:46.879940 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86cdf84568-mc2rg"] Apr 20 15:05:46.887545 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:05:46.887463 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17be87f1_b077_409b_803a_3dfe473a0aa7.slice/crio-53c7a8981a6455cfb3693209cbcf1acd63a457aad8899a7160011b9ee96570aa WatchSource:0}: Error finding container 53c7a8981a6455cfb3693209cbcf1acd63a457aad8899a7160011b9ee96570aa: Status 404 returned error can't find the container with id 53c7a8981a6455cfb3693209cbcf1acd63a457aad8899a7160011b9ee96570aa Apr 20 15:05:47.772031 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:47.771980 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bhvzv" event={"ID":"65bc4e5e-758a-43c0-abbc-de79866d22b8","Type":"ContainerStarted","Data":"d29e573f0436d99f92a1101348d7ada959d6dc69f27b4a8aa8ba4898b471b3c1"} Apr 20 15:05:47.772031 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:47.772024 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bhvzv" event={"ID":"65bc4e5e-758a-43c0-abbc-de79866d22b8","Type":"ContainerStarted","Data":"15815a58369ae3f497229ba80dc35c1f010a2192594c77a40bb6c401e2b05251"} Apr 20 15:05:47.774035 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:47.773992 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86cdf84568-mc2rg" 
event={"ID":"17be87f1-b077-409b-803a-3dfe473a0aa7","Type":"ContainerStarted","Data":"53c7a8981a6455cfb3693209cbcf1acd63a457aad8899a7160011b9ee96570aa"} Apr 20 15:05:47.793564 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:47.793499 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-bhvzv" podStartSLOduration=3.072449505 podStartE2EDuration="10.793480061s" podCreationTimestamp="2026-04-20 15:05:37 +0000 UTC" firstStartedPulling="2026-04-20 15:05:37.842404264 +0000 UTC m=+188.277948402" lastFinishedPulling="2026-04-20 15:05:45.563434812 +0000 UTC m=+195.998978958" observedRunningTime="2026-04-20 15:05:47.79172639 +0000 UTC m=+198.227270534" watchObservedRunningTime="2026-04-20 15:05:47.793480061 +0000 UTC m=+198.229024214" Apr 20 15:05:48.633451 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:48.633329 2577 patch_prober.go:28] interesting pod/image-registry-b959fb7fc-fmqvl container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 15:05:48.633451 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:48.633385 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" podUID="36c8144f-abb6-49c5-b926-7737d427688b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 15:05:50.691213 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:50.691169 2577 patch_prober.go:28] interesting pod/image-registry-b959fb7fc-fmqvl container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 15:05:50.691698 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:50.691239 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" podUID="36c8144f-abb6-49c5-b926-7737d427688b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 15:05:50.786694 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:50.786634 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77c4645b68-cvtsr" event={"ID":"43428ca4-798e-4aea-98e6-809f01256074","Type":"ContainerStarted","Data":"088697d48783364ad995457fd55818e174594f4288bffd71b3ef040d0f58356f"} Apr 20 15:05:50.788371 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:50.788337 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" event={"ID":"58db9f7d-153a-4038-be2f-db349004a778","Type":"ContainerStarted","Data":"4d52f6b6bf72ed6456e94ab2e41f6d5835c4bcd64c0d4536030a83b0e0b29217"} Apr 20 15:05:50.789850 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:50.789810 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-f2vbx" event={"ID":"96aa1153-b6d1-41da-b998-8cb9a9e6f56a","Type":"ContainerStarted","Data":"3c92179400a3549f4b029b386fdf90088341a03de005014cbfb6ad614229b520"} Apr 20 15:05:50.790111 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:50.790080 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-f2vbx" Apr 20 15:05:50.792121 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:50.792094 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86cdf84568-mc2rg" event={"ID":"17be87f1-b077-409b-803a-3dfe473a0aa7","Type":"ContainerStarted","Data":"fdddb2f5eb90fbee780b30307019e9e55aab354ab43330775c21533579a75ad8"} Apr 20 15:05:50.795885 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:50.795863 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-f2vbx" Apr 20 15:05:50.807633 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:50.807576 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77c4645b68-cvtsr" podStartSLOduration=4.375633067 podStartE2EDuration="8.807557484s" podCreationTimestamp="2026-04-20 15:05:42 +0000 UTC" firstStartedPulling="2026-04-20 15:05:45.942985286 +0000 UTC m=+196.378529419" lastFinishedPulling="2026-04-20 15:05:50.374909701 +0000 UTC m=+200.810453836" observedRunningTime="2026-04-20 15:05:50.806381075 +0000 UTC m=+201.241925229" watchObservedRunningTime="2026-04-20 15:05:50.807557484 +0000 UTC m=+201.243101643" Apr 20 15:05:50.833635 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:50.833570 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-f2vbx" podStartSLOduration=4.407013083 podStartE2EDuration="8.833550051s" podCreationTimestamp="2026-04-20 15:05:42 +0000 UTC" firstStartedPulling="2026-04-20 15:05:45.94089272 +0000 UTC m=+196.376436867" lastFinishedPulling="2026-04-20 15:05:50.367429699 +0000 UTC m=+200.802973835" observedRunningTime="2026-04-20 15:05:50.831615807 +0000 UTC m=+201.267159961" watchObservedRunningTime="2026-04-20 15:05:50.833550051 +0000 UTC m=+201.269094219" Apr 20 15:05:50.910363 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:50.910128 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" podStartSLOduration=5.265722949 podStartE2EDuration="9.910107417s" podCreationTimestamp="2026-04-20 15:05:41 +0000 UTC" firstStartedPulling="2026-04-20 15:05:45.723042364 +0000 UTC m=+196.158586500" lastFinishedPulling="2026-04-20 15:05:50.367426832 +0000 UTC m=+200.802970968" observedRunningTime="2026-04-20 15:05:50.908721153 +0000 UTC m=+201.344265321" watchObservedRunningTime="2026-04-20 15:05:50.910107417 +0000 UTC m=+201.345651572" Apr 20 15:05:50.914311 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:50.910691 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-86cdf84568-mc2rg" podStartSLOduration=1.266359188 podStartE2EDuration="4.910675949s" podCreationTimestamp="2026-04-20 15:05:46 +0000 UTC" firstStartedPulling="2026-04-20 15:05:46.891505519 +0000 UTC m=+197.327049667" lastFinishedPulling="2026-04-20 15:05:50.535822282 +0000 UTC m=+200.971366428" observedRunningTime="2026-04-20 15:05:50.876897152 +0000 UTC m=+201.312441325" watchObservedRunningTime="2026-04-20 15:05:50.910675949 +0000 UTC m=+201.346220104" Apr 20 15:05:53.245503 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:53.245464 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:53.246077 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:53.245617 2577 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:05:53.247376 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:53.247350 2577 patch_prober.go:28] interesting pod/console-77c4645b68-cvtsr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.20:8443/health\": dial tcp 10.134.0.20:8443: connect: connection refused" start-of-body= Apr 20 15:05:53.247510 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:53.247441 2577 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-77c4645b68-cvtsr" podUID="43428ca4-798e-4aea-98e6-809f01256074" containerName="console" probeResult="failure" output="Get \"https://10.134.0.20:8443/health\": dial tcp 10.134.0.20:8443: connect: connection refused" Apr 20 15:05:56.666483 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:56.666445 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:56.666483 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:56.666487 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:05:56.667968 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:56.667942 2577 patch_prober.go:28] interesting pod/console-86cdf84568-mc2rg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.21:8443/health\": dial tcp 10.134.0.21:8443: connect: connection refused" start-of-body= Apr 20 15:05:56.668093 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:56.668029 2577 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-86cdf84568-mc2rg" podUID="17be87f1-b077-409b-803a-3dfe473a0aa7" containerName="console" probeResult="failure" output="Get \"https://10.134.0.21:8443/health\": dial tcp 10.134.0.21:8443: connect: connection refused" Apr 20 15:05:58.631739 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:58.631699 2577 patch_prober.go:28] interesting pod/image-registry-b959fb7fc-fmqvl container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 15:05:58.632180 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:05:58.631761 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" podUID="36c8144f-abb6-49c5-b926-7737d427688b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 15:06:00.691074 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:00.691036 2577 patch_prober.go:28] interesting pod/image-registry-b959fb7fc-fmqvl container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 15:06:00.691569 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:00.691100 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" podUID="36c8144f-abb6-49c5-b926-7737d427688b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 15:06:02.214368 ip-10-0-129-115 kubenswrapper[2577]: 
I0420 15:06:02.214329 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:06:02.214368 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:02.214376 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:06:03.246403 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:03.246364 2577 patch_prober.go:28] interesting pod/console-77c4645b68-cvtsr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.20:8443/health\": dial tcp 10.134.0.20:8443: connect: connection refused" start-of-body= Apr 20 15:06:03.246852 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:03.246426 2577 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-77c4645b68-cvtsr" podUID="43428ca4-798e-4aea-98e6-809f01256074" containerName="console" probeResult="failure" output="Get \"https://10.134.0.20:8443/health\": dial tcp 10.134.0.20:8443: connect: connection refused" Apr 20 15:06:06.666239 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:06.666146 2577 patch_prober.go:28] interesting pod/console-86cdf84568-mc2rg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.21:8443/health\": dial tcp 10.134.0.21:8443: connect: connection refused" start-of-body= Apr 20 15:06:06.666239 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:06.666205 2577 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-86cdf84568-mc2rg" podUID="17be87f1-b077-409b-803a-3dfe473a0aa7" containerName="console" probeResult="failure" output="Get \"https://10.134.0.21:8443/health\": dial tcp 10.134.0.21:8443: connect: connection refused" Apr 20 15:06:08.633754 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:08.633186 2577 patch_prober.go:28] interesting pod/image-registry-b959fb7fc-fmqvl container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 15:06:08.633754 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:08.633266 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" podUID="36c8144f-abb6-49c5-b926-7737d427688b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 15:06:08.633754 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:08.633350 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:06:08.634357 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:08.634095 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"27dcf47d70cf2ba5f3fabf982c3b9113c224f606394619aad246f890c072b325"} pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" containerMessage="Container registry failed liveness probe, will be restarted" Apr 20 15:06:08.639190 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:08.639160 2577 patch_prober.go:28] interesting pod/image-registry-b959fb7fc-fmqvl container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" 
start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 15:06:08.639349 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:08.639209 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" podUID="36c8144f-abb6-49c5-b926-7737d427688b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 15:06:13.245945 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:13.245908 2577 patch_prober.go:28] interesting pod/console-77c4645b68-cvtsr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.20:8443/health\": dial tcp 10.134.0.20:8443: connect: connection refused" start-of-body= Apr 20 15:06:13.246383 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:13.245963 2577 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-77c4645b68-cvtsr" podUID="43428ca4-798e-4aea-98e6-809f01256074" containerName="console" probeResult="failure" output="Get \"https://10.134.0.20:8443/health\": dial tcp 10.134.0.20:8443: connect: connection refused" Apr 20 15:06:15.873429 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:15.873394 2577 generic.go:358] "Generic (PLEG): container finished" podID="23ae8320-d3f3-4b75-8ae2-136783ad218b" containerID="3f5778dff576eff6538431f7fa883057a1d6003bdd884043ed92d79c2a4a7f9e" exitCode=0 Apr 20 15:06:15.873904 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:15.873472 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-mgv7v" event={"ID":"23ae8320-d3f3-4b75-8ae2-136783ad218b","Type":"ContainerDied","Data":"3f5778dff576eff6538431f7fa883057a1d6003bdd884043ed92d79c2a4a7f9e"} Apr 20 15:06:15.873904 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:15.873893 2577 scope.go:117] "RemoveContainer" containerID="3f5778dff576eff6538431f7fa883057a1d6003bdd884043ed92d79c2a4a7f9e" Apr 20 15:06:16.666789 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:16.666755 2577 patch_prober.go:28] interesting pod/console-86cdf84568-mc2rg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.21:8443/health\": dial tcp 10.134.0.21:8443: connect: connection refused" start-of-body= Apr 20 15:06:16.666970 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:16.666824 2577 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-86cdf84568-mc2rg" podUID="17be87f1-b077-409b-803a-3dfe473a0aa7" containerName="console" probeResult="failure" output="Get \"https://10.134.0.21:8443/health\": dial tcp 10.134.0.21:8443: connect: connection refused" Apr 20 15:06:16.878347 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:16.878309 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-mgv7v" event={"ID":"23ae8320-d3f3-4b75-8ae2-136783ad218b","Type":"ContainerStarted","Data":"6e4264a5a79f81b6e4093759025eef52edb5ff88770cc3c015a9a7e4c270193f"} Apr 20 15:06:18.637187 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:18.637151 2577 patch_prober.go:28] interesting pod/image-registry-b959fb7fc-fmqvl container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 15:06:18.637586 
ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:18.637203 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" podUID="36c8144f-abb6-49c5-b926-7737d427688b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 15:06:21.183907 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:21.183869 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-f684b8c45-5dpm6_58db9f7d-153a-4038-be2f-db349004a778/metrics-server/0.log" Apr 20 15:06:21.383765 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:21.383734 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-f2vbx_96aa1153-b6d1-41da-b998-8cb9a9e6f56a/monitoring-plugin/0.log" Apr 20 15:06:22.184329 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:22.184271 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bhvzv_65bc4e5e-758a-43c0-abbc-de79866d22b8/init-textfile/0.log" Apr 20 15:06:22.219201 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:22.219170 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:06:22.223030 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:22.223006 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-f684b8c45-5dpm6" Apr 20 15:06:22.384787 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:22.384705 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bhvzv_65bc4e5e-758a-43c0-abbc-de79866d22b8/node-exporter/0.log" Apr 20 15:06:22.584547 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:22.584517 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bhvzv_65bc4e5e-758a-43c0-abbc-de79866d22b8/kube-rbac-proxy/0.log" Apr 20 15:06:22.900422 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:22.900385 2577 generic.go:358] "Generic (PLEG): container finished" podID="fd391b00-3f8a-4173-b08a-659c434b7b1a" containerID="76eead5349f19a46451dfad26486f21c3feac0f1165a9273e6a99ec754f2fd1e" exitCode=0 Apr 20 15:06:22.900611 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:22.900461 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c6d9q" event={"ID":"fd391b00-3f8a-4173-b08a-659c434b7b1a","Type":"ContainerDied","Data":"76eead5349f19a46451dfad26486f21c3feac0f1165a9273e6a99ec754f2fd1e"} Apr 20 15:06:22.901099 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:22.901074 2577 scope.go:117] "RemoveContainer" containerID="76eead5349f19a46451dfad26486f21c3feac0f1165a9273e6a99ec754f2fd1e" Apr 20 15:06:23.245968 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:23.245926 2577 patch_prober.go:28] interesting pod/console-77c4645b68-cvtsr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.20:8443/health\": dial tcp 10.134.0.20:8443: connect: connection refused" start-of-body= Apr 20 15:06:23.246487 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:23.245988 2577 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-77c4645b68-cvtsr" podUID="43428ca4-798e-4aea-98e6-809f01256074" containerName="console" probeResult="failure" output="Get \"https://10.134.0.20:8443/health\": dial tcp 10.134.0.20:8443: connect: connection 
refused" Apr 20 15:06:23.904846 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:23.904811 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c6d9q" event={"ID":"fd391b00-3f8a-4173-b08a-659c434b7b1a","Type":"ContainerStarted","Data":"207376249c0b065cc68d77263c2059670bd504ce8ce077c7848490e53b963aa0"} Apr 20 15:06:25.385320 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:25.385269 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-fdghk_c85d567e-27dc-4cfa-bcf6-f4b3181c4e62/prometheus-operator/0.log" Apr 20 15:06:25.584834 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:25.584800 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-fdghk_c85d567e-27dc-4cfa-bcf6-f4b3181c4e62/kube-rbac-proxy/0.log" Apr 20 15:06:25.783685 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:25.783655 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-xkh5p_62be979d-08db-4830-a936-48380c484f67/prometheus-operator-admission-webhook/0.log" Apr 20 15:06:26.666759 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:26.666723 2577 patch_prober.go:28] interesting pod/console-86cdf84568-mc2rg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.21:8443/health\": dial tcp 10.134.0.21:8443: connect: connection refused" start-of-body= Apr 20 15:06:26.667240 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:26.666783 2577 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-86cdf84568-mc2rg" podUID="17be87f1-b077-409b-803a-3dfe473a0aa7" containerName="console" probeResult="failure" output="Get \"https://10.134.0.21:8443/health\": dial tcp 10.134.0.21:8443: connect: connection refused" Apr 20 15:06:28.384111 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:28.384083 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77c4645b68-cvtsr_43428ca4-798e-4aea-98e6-809f01256074/console/0.log" Apr 20 15:06:28.585498 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:28.585460 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86cdf84568-mc2rg_17be87f1-b077-409b-803a-3dfe473a0aa7/console/0.log" Apr 20 15:06:28.637219 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:28.637136 2577 patch_prober.go:28] interesting pod/image-registry-b959fb7fc-fmqvl container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 15:06:28.637219 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:28.637199 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" podUID="36c8144f-abb6-49c5-b926-7737d427688b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 15:06:28.785489 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:28.785451 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-c2qbw_9c3cc9ef-d0f3-4ac1-b2cb-c8a54c30ac41/download-server/0.log" Apr 20 15:06:30.183711 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:30.183681 2577 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-dns_node-resolver-hc5nc_423472ab-8267-4409-9aa0-f1d4a9c14e79/dns-node-resolver/0.log" Apr 20 15:06:30.984585 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:30.984552 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-b959fb7fc-fmqvl_36c8144f-abb6-49c5-b926-7737d427688b/registry/0.log" Apr 20 15:06:31.183936 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:31.183908 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7vhjb_3b19ab41-5f62-4594-9262-8789718fb9e9/node-ca/0.log" Apr 20 15:06:31.784558 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:31.784527 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-8649c78dc4-vwfc7_a56509b8-31c7-4b3a-b397-2eff1d2f128c/router/0.log" Apr 20 15:06:33.245706 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:33.245671 2577 patch_prober.go:28] interesting pod/console-77c4645b68-cvtsr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.20:8443/health\": dial tcp 10.134.0.20:8443: connect: connection refused" start-of-body= Apr 20 15:06:33.246110 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:33.245729 2577 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-77c4645b68-cvtsr" podUID="43428ca4-798e-4aea-98e6-809f01256074" containerName="console" probeResult="failure" output="Get \"https://10.134.0.20:8443/health\": dial tcp 10.134.0.20:8443: connect: connection refused" Apr 20 15:06:33.656320 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:33.656210 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" podUID="36c8144f-abb6-49c5-b926-7737d427688b" containerName="registry" containerID="cri-o://27dcf47d70cf2ba5f3fabf982c3b9113c224f606394619aad246f890c072b325" gracePeriod=30 Apr 20 15:06:34.939661 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:34.939624 2577 generic.go:358] "Generic (PLEG): container finished" podID="36c8144f-abb6-49c5-b926-7737d427688b" containerID="27dcf47d70cf2ba5f3fabf982c3b9113c224f606394619aad246f890c072b325" exitCode=0 Apr 20 15:06:34.940121 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:34.939711 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" event={"ID":"36c8144f-abb6-49c5-b926-7737d427688b","Type":"ContainerDied","Data":"27dcf47d70cf2ba5f3fabf982c3b9113c224f606394619aad246f890c072b325"} Apr 20 15:06:34.940121 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:34.939756 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" event={"ID":"36c8144f-abb6-49c5-b926-7737d427688b","Type":"ContainerStarted","Data":"fcdd75244d0ae375d2bff5106b465c4e4c77d977076de1c899aa9ee0eb449881"} Apr 20 15:06:34.940121 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:34.939788 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:06:36.666565 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:36.666521 2577 patch_prober.go:28] interesting pod/console-86cdf84568-mc2rg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.21:8443/health\": dial tcp 10.134.0.21:8443: connect: connection refused" start-of-body= Apr 20 15:06:36.666970 
ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:36.666580 2577 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-86cdf84568-mc2rg" podUID="17be87f1-b077-409b-803a-3dfe473a0aa7" containerName="console" probeResult="failure" output="Get \"https://10.134.0.21:8443/health\": dial tcp 10.134.0.21:8443: connect: connection refused" Apr 20 15:06:41.930913 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:41.930808 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs\") pod \"network-metrics-daemon-dp887\" (UID: \"5987592a-660d-4466-bbc4-5bd812cca838\") " pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:06:41.933395 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:41.933370 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5987592a-660d-4466-bbc4-5bd812cca838-metrics-certs\") pod \"network-metrics-daemon-dp887\" (UID: \"5987592a-660d-4466-bbc4-5bd812cca838\") " pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:06:42.195172 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:42.195092 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-64ff7\"" Apr 20 15:06:42.203207 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:42.203188 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dp887" Apr 20 15:06:42.325314 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:42.325203 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dp887"] Apr 20 15:06:42.328091 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:06:42.328060 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5987592a_660d_4466_bbc4_5bd812cca838.slice/crio-e66cd80e43a02c63fdbba4a4d96d8e9451b91780373a824b7b580a0dc2d3d2c0 WatchSource:0}: Error finding container e66cd80e43a02c63fdbba4a4d96d8e9451b91780373a824b7b580a0dc2d3d2c0: Status 404 returned error can't find the container with id e66cd80e43a02c63fdbba4a4d96d8e9451b91780373a824b7b580a0dc2d3d2c0 Apr 20 15:06:42.964716 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:42.964677 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dp887" event={"ID":"5987592a-660d-4466-bbc4-5bd812cca838","Type":"ContainerStarted","Data":"e66cd80e43a02c63fdbba4a4d96d8e9451b91780373a824b7b580a0dc2d3d2c0"} Apr 20 15:06:43.245576 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:43.245547 2577 patch_prober.go:28] interesting pod/console-77c4645b68-cvtsr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.20:8443/health\": dial tcp 10.134.0.20:8443: connect: connection refused" start-of-body= Apr 20 15:06:43.245694 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:43.245596 2577 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-77c4645b68-cvtsr" podUID="43428ca4-798e-4aea-98e6-809f01256074" containerName="console" probeResult="failure" output="Get \"https://10.134.0.20:8443/health\": dial tcp 10.134.0.20:8443: connect: connection refused" Apr 20 15:06:43.970349 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:43.970311 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-dp887" event={"ID":"5987592a-660d-4466-bbc4-5bd812cca838","Type":"ContainerStarted","Data":"1aa58426d762f69a5cdbb39674874c800f1d99556959646e88c38dc3263598f5"} Apr 20 15:06:43.970349 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:43.970349 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dp887" event={"ID":"5987592a-660d-4466-bbc4-5bd812cca838","Type":"ContainerStarted","Data":"c42ef1a18d794e157be59c5fef0d09719422d8319f47f94aa101ea0b1e39e30b"} Apr 20 15:06:43.986804 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:43.986742 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-dp887" podStartSLOduration=253.081182778 podStartE2EDuration="4m13.986720611s" podCreationTimestamp="2026-04-20 15:02:30 +0000 UTC" firstStartedPulling="2026-04-20 15:06:42.329814782 +0000 UTC m=+252.765358928" lastFinishedPulling="2026-04-20 15:06:43.235352621 +0000 UTC m=+253.670896761" observedRunningTime="2026-04-20 15:06:43.985900415 +0000 UTC m=+254.421444569" watchObservedRunningTime="2026-04-20 15:06:43.986720611 +0000 UTC m=+254.422264766" Apr 20 15:06:46.666762 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:46.666723 2577 patch_prober.go:28] interesting pod/console-86cdf84568-mc2rg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.21:8443/health\": dial tcp 10.134.0.21:8443: connect: connection refused" start-of-body= Apr 20 15:06:46.667175 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:46.666780 2577 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-86cdf84568-mc2rg" podUID="17be87f1-b077-409b-803a-3dfe473a0aa7" containerName="console" probeResult="failure" output="Get \"https://10.134.0.21:8443/health\": dial tcp 10.134.0.21:8443: connect: connection refused" Apr 20 15:06:48.633016 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:48.632978 2577 patch_prober.go:28] interesting pod/image-registry-b959fb7fc-fmqvl container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 15:06:48.633410 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:48.633040 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" podUID="36c8144f-abb6-49c5-b926-7737d427688b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 15:06:53.246427 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:53.246381 2577 patch_prober.go:28] interesting pod/console-77c4645b68-cvtsr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.20:8443/health\": dial tcp 10.134.0.20:8443: connect: connection refused" start-of-body= Apr 20 15:06:53.246820 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:53.246449 2577 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-77c4645b68-cvtsr" podUID="43428ca4-798e-4aea-98e6-809f01256074" containerName="console" probeResult="failure" output="Get \"https://10.134.0.20:8443/health\": dial tcp 10.134.0.20:8443: connect: connection refused" Apr 20 15:06:55.749326 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.749271 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-77c4645b68-cvtsr"] Apr 20 15:06:55.781424 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.781390 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d9bf87884-scbxr"] Apr 20 15:06:55.785655 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.785630 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:55.796508 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.796482 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d9bf87884-scbxr"] Apr 20 15:06:55.848132 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.848089 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-console-config\") pod \"console-5d9bf87884-scbxr\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:55.848350 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.848136 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-trusted-ca-bundle\") pod \"console-5d9bf87884-scbxr\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:55.848350 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.848218 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-console-oauth-config\") pod \"console-5d9bf87884-scbxr\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:55.848350 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.848241 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-oauth-serving-cert\") pod \"console-5d9bf87884-scbxr\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:55.848350 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.848337 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-service-ca\") pod \"console-5d9bf87884-scbxr\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:55.848494 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.848372 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48b8v\" (UniqueName: \"kubernetes.io/projected/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-kube-api-access-48b8v\") pod \"console-5d9bf87884-scbxr\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:55.848494 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.848408 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-console-serving-cert\") pod 
\"console-5d9bf87884-scbxr\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:55.946781 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.946750 2577 patch_prober.go:28] interesting pod/image-registry-b959fb7fc-fmqvl container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 15:06:55.946953 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.946804 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" podUID="36c8144f-abb6-49c5-b926-7737d427688b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 15:06:55.948813 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.948793 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-console-serving-cert\") pod \"console-5d9bf87884-scbxr\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:55.948862 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.948828 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-console-config\") pod \"console-5d9bf87884-scbxr\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:55.948862 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.948850 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-trusted-ca-bundle\") pod \"console-5d9bf87884-scbxr\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:55.949054 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.949032 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-console-oauth-config\") pod \"console-5d9bf87884-scbxr\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:55.949106 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.949085 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-oauth-serving-cert\") pod \"console-5d9bf87884-scbxr\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:55.949179 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.949163 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-service-ca\") pod \"console-5d9bf87884-scbxr\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:55.949233 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.949199 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-48b8v\" (UniqueName: \"kubernetes.io/projected/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-kube-api-access-48b8v\") pod \"console-5d9bf87884-scbxr\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:55.949766 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.949738 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-oauth-serving-cert\") pod \"console-5d9bf87884-scbxr\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:55.949766 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.949755 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-console-config\") pod \"console-5d9bf87884-scbxr\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:55.949958 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.949801 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-trusted-ca-bundle\") pod \"console-5d9bf87884-scbxr\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:55.949958 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.949876 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-service-ca\") pod \"console-5d9bf87884-scbxr\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:55.951336 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.951316 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-console-serving-cert\") pod \"console-5d9bf87884-scbxr\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:55.951446 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.951418 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-console-oauth-config\") pod \"console-5d9bf87884-scbxr\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:55.961252 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:55.961227 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48b8v\" (UniqueName: \"kubernetes.io/projected/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-kube-api-access-48b8v\") pod \"console-5d9bf87884-scbxr\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:56.094974 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:56.094877 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:06:56.217845 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:56.217793 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d9bf87884-scbxr"] Apr 20 15:06:56.220003 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:06:56.219975 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6789ba89_9e79_4d97_aa2d_d3ad018fe5fa.slice/crio-58f631cf951b2138ba39e3ec42d195761bc66eae68dc7b8d7e12954c845385ef WatchSource:0}: Error finding container 58f631cf951b2138ba39e3ec42d195761bc66eae68dc7b8d7e12954c845385ef: Status 404 returned error can't find the container with id 58f631cf951b2138ba39e3ec42d195761bc66eae68dc7b8d7e12954c845385ef Apr 20 15:06:56.666429 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:56.666393 2577 patch_prober.go:28] interesting pod/console-86cdf84568-mc2rg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.21:8443/health\": dial tcp 10.134.0.21:8443: connect: connection refused" start-of-body= Apr 20 15:06:56.666610 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:56.666450 2577 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-86cdf84568-mc2rg" podUID="17be87f1-b077-409b-803a-3dfe473a0aa7" containerName="console" probeResult="failure" output="Get \"https://10.134.0.21:8443/health\": dial tcp 10.134.0.21:8443: connect: connection refused" Apr 20 15:06:57.011539 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:57.011490 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d9bf87884-scbxr" event={"ID":"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa","Type":"ContainerStarted","Data":"8b67b9075a34cedadf35b925434c704c5a43e05367085fa2227ccbbe92478bf9"} Apr 20 15:06:57.011922 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:57.011546 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d9bf87884-scbxr" event={"ID":"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa","Type":"ContainerStarted","Data":"58f631cf951b2138ba39e3ec42d195761bc66eae68dc7b8d7e12954c845385ef"} Apr 20 15:06:57.036589 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:57.036524 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d9bf87884-scbxr" podStartSLOduration=2.036503186 podStartE2EDuration="2.036503186s" podCreationTimestamp="2026-04-20 15:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:06:57.031708222 +0000 UTC m=+267.467252376" watchObservedRunningTime="2026-04-20 15:06:57.036503186 +0000 UTC m=+267.472047342" Apr 20 15:06:58.632098 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:58.632061 2577 patch_prober.go:28] interesting pod/image-registry-b959fb7fc-fmqvl container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 15:06:58.632487 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:06:58.632123 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" podUID="36c8144f-abb6-49c5-b926-7737d427688b" containerName="registry" probeResult="failure" output="HTTP probe 
failed with statuscode: 503" Apr 20 15:07:05.947789 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:05.947748 2577 patch_prober.go:28] interesting pod/image-registry-b959fb7fc-fmqvl container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 15:07:05.948268 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:05.947822 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" podUID="36c8144f-abb6-49c5-b926-7737d427688b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 15:07:06.095936 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:06.095889 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:07:06.096124 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:06.096006 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:07:06.097218 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:06.097195 2577 patch_prober.go:28] interesting pod/console-5d9bf87884-scbxr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.22:8443/health\": dial tcp 10.134.0.22:8443: connect: connection refused" start-of-body= Apr 20 15:07:06.097348 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:06.097239 2577 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-5d9bf87884-scbxr" podUID="6789ba89-9e79-4d97-aa2d-d3ad018fe5fa" containerName="console" probeResult="failure" output="Get \"https://10.134.0.22:8443/health\": dial tcp 10.134.0.22:8443: connect: connection refused" Apr 20 15:07:06.666721 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:06.666632 2577 patch_prober.go:28] interesting pod/console-86cdf84568-mc2rg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.21:8443/health\": dial tcp 10.134.0.21:8443: connect: connection refused" start-of-body= Apr 20 15:07:06.666721 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:06.666687 2577 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-86cdf84568-mc2rg" podUID="17be87f1-b077-409b-803a-3dfe473a0aa7" containerName="console" probeResult="failure" output="Get \"https://10.134.0.21:8443/health\": dial tcp 10.134.0.21:8443: connect: connection refused" Apr 20 15:07:08.632397 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:08.632361 2577 patch_prober.go:28] interesting pod/image-registry-b959fb7fc-fmqvl container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 15:07:08.632881 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:08.632432 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" podUID="36c8144f-abb6-49c5-b926-7737d427688b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 15:07:08.632881 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:08.632477 2577 kubelet.go:2658] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:07:08.633089 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:08.633057 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"fcdd75244d0ae375d2bff5106b465c4e4c77d977076de1c899aa9ee0eb449881"} pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" containerMessage="Container registry failed liveness probe, will be restarted" Apr 20 15:07:08.636840 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:08.636816 2577 patch_prober.go:28] interesting pod/image-registry-b959fb7fc-fmqvl container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 15:07:08.636961 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:08.636863 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" podUID="36c8144f-abb6-49c5-b926-7737d427688b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 15:07:09.613780 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:07:09.613730 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-7rhcs" podUID="cbcbf670-2941-48c2-8a4a-b5f253135d10" Apr 20 15:07:09.613780 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:07:09.613735 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-rwbpz" podUID="548662da-3b01-418f-b71e-7805525a03e5" Apr 20 15:07:10.049087 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:10.049054 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7rhcs" Apr 20 15:07:10.049479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:10.049054 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rwbpz" Apr 20 15:07:13.000266 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:13.000222 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls\") pod \"dns-default-rwbpz\" (UID: \"548662da-3b01-418f-b71e-7805525a03e5\") " pod="openshift-dns/dns-default-rwbpz" Apr 20 15:07:13.000762 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:13.000302 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert\") pod \"ingress-canary-7rhcs\" (UID: \"cbcbf670-2941-48c2-8a4a-b5f253135d10\") " pod="openshift-ingress-canary/ingress-canary-7rhcs" Apr 20 15:07:13.002797 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:13.002763 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/548662da-3b01-418f-b71e-7805525a03e5-metrics-tls\") pod \"dns-default-rwbpz\" (UID: \"548662da-3b01-418f-b71e-7805525a03e5\") " pod="openshift-dns/dns-default-rwbpz" Apr 20 15:07:13.002915 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:13.002814 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbcbf670-2941-48c2-8a4a-b5f253135d10-cert\") pod \"ingress-canary-7rhcs\" (UID: \"cbcbf670-2941-48c2-8a4a-b5f253135d10\") " pod="openshift-ingress-canary/ingress-canary-7rhcs" Apr 20 15:07:13.052042 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:13.052009 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vdrzr\"" Apr 20 15:07:13.052703 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:13.052686 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4l2x2\"" Apr 20 15:07:13.060100 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:13.060078 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7rhcs" Apr 20 15:07:13.060222 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:13.060176 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rwbpz" Apr 20 15:07:13.201568 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:13.201540 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7rhcs"] Apr 20 15:07:13.204244 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:07:13.204214 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbcbf670_2941_48c2_8a4a_b5f253135d10.slice/crio-2db15729ddda9d81d4ea9fd57222939bd3e7d66188f524320306937397a3ec48 WatchSource:0}: Error finding container 2db15729ddda9d81d4ea9fd57222939bd3e7d66188f524320306937397a3ec48: Status 404 returned error can't find the container with id 2db15729ddda9d81d4ea9fd57222939bd3e7d66188f524320306937397a3ec48 Apr 20 15:07:13.217521 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:13.217493 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rwbpz"] Apr 20 15:07:13.220197 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:07:13.220163 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod548662da_3b01_418f_b71e_7805525a03e5.slice/crio-40b6a958f7318f787beb0334cba1e9894c79e8df381cccc0944016e701e4c1a7 WatchSource:0}: Error finding container 40b6a958f7318f787beb0334cba1e9894c79e8df381cccc0944016e701e4c1a7: Status 404 returned error can't find the container with id 40b6a958f7318f787beb0334cba1e9894c79e8df381cccc0944016e701e4c1a7 Apr 20 15:07:14.062007 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:14.061944 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7rhcs" event={"ID":"cbcbf670-2941-48c2-8a4a-b5f253135d10","Type":"ContainerStarted","Data":"2db15729ddda9d81d4ea9fd57222939bd3e7d66188f524320306937397a3ec48"} Apr 20 15:07:14.063483 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:14.063447 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rwbpz" event={"ID":"548662da-3b01-418f-b71e-7805525a03e5","Type":"ContainerStarted","Data":"40b6a958f7318f787beb0334cba1e9894c79e8df381cccc0944016e701e4c1a7"} Apr 20 15:07:16.072716 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:16.072677 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7rhcs" event={"ID":"cbcbf670-2941-48c2-8a4a-b5f253135d10","Type":"ContainerStarted","Data":"9dc6d42fca885bbc039800e619ed473c51edfec39d48214d8542d775741d221d"} Apr 20 15:07:16.074347 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:16.074321 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rwbpz" event={"ID":"548662da-3b01-418f-b71e-7805525a03e5","Type":"ContainerStarted","Data":"61436adc4fd7db9487f199d1fcbb8f8bb759f1494aa56888c704c065880d3068"} Apr 20 15:07:16.074347 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:16.074350 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rwbpz" event={"ID":"548662da-3b01-418f-b71e-7805525a03e5","Type":"ContainerStarted","Data":"22d02c7dfac10e73c7f2d74bda31b206b8a40ea03dcbf2268dbfa380c1415447"} Apr 20 15:07:16.074507 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:16.074462 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-rwbpz" Apr 20 15:07:16.096673 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:16.096636 2577 patch_prober.go:28] interesting pod/console-5d9bf87884-scbxr 
container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.22:8443/health\": dial tcp 10.134.0.22:8443: connect: connection refused" start-of-body= Apr 20 15:07:16.096860 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:16.096695 2577 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-5d9bf87884-scbxr" podUID="6789ba89-9e79-4d97-aa2d-d3ad018fe5fa" containerName="console" probeResult="failure" output="Get \"https://10.134.0.22:8443/health\": dial tcp 10.134.0.22:8443: connect: connection refused" Apr 20 15:07:16.104700 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:16.104637 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rwbpz" podStartSLOduration=252.311285884 podStartE2EDuration="4m14.10461674s" podCreationTimestamp="2026-04-20 15:03:02 +0000 UTC" firstStartedPulling="2026-04-20 15:07:13.222069587 +0000 UTC m=+283.657613720" lastFinishedPulling="2026-04-20 15:07:15.015400436 +0000 UTC m=+285.450944576" observedRunningTime="2026-04-20 15:07:16.102756304 +0000 UTC m=+286.538300459" watchObservedRunningTime="2026-04-20 15:07:16.10461674 +0000 UTC m=+286.540160897" Apr 20 15:07:16.105102 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:16.105061 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7rhcs" podStartSLOduration=252.292111231 podStartE2EDuration="4m14.105049174s" podCreationTimestamp="2026-04-20 15:03:02 +0000 UTC" firstStartedPulling="2026-04-20 15:07:13.20614025 +0000 UTC m=+283.641684382" lastFinishedPulling="2026-04-20 15:07:15.01907818 +0000 UTC m=+285.454622325" observedRunningTime="2026-04-20 15:07:16.086925574 +0000 UTC m=+286.522469730" watchObservedRunningTime="2026-04-20 15:07:16.105049174 +0000 UTC m=+286.540593329" Apr 20 15:07:16.670452 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:16.670418 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:07:16.674375 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:16.674345 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:07:18.637505 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:18.637474 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:07:20.768648 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:20.768602 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-77c4645b68-cvtsr" podUID="43428ca4-798e-4aea-98e6-809f01256074" containerName="console" containerID="cri-o://088697d48783364ad995457fd55818e174594f4288bffd71b3ef040d0f58356f" gracePeriod=15 Apr 20 15:07:21.010930 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.010906 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77c4645b68-cvtsr_43428ca4-798e-4aea-98e6-809f01256074/console/0.log" Apr 20 15:07:21.011069 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.010978 2577 util.go:48] "No ready sandbox for pod can be found. 
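The two "Observed pod startup duration" entries above also expose the arithmetic behind the startup-latency tracker: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that value with the image-pull window (lastFinishedPulling minus firstStartedPulling) subtracted; the dns-default-rwbpz figures line up with this to within a few microseconds. A short Go check of that arithmetic, assuming exactly that relationship and reusing the timestamps printed in the entry:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matching the timestamps in the kubelet log; fractional
	// seconds in the input are accepted even though the layout omits them.
	const layout = "2006-01-02 15:04:05 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	// Values copied from the dns-default-rwbpz entry above.
	created := parse("2026-04-20 15:03:02 +0000 UTC")
	firstPull := parse("2026-04-20 15:07:13.222069587 +0000 UTC")
	lastPull := parse("2026-04-20 15:07:15.015400436 +0000 UTC")
	running := parse("2026-04-20 15:07:16.10461674 +0000 UTC")

	e2e := running.Sub(created)          // ~4m14.1s, matches podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // ~252.31s, matches podStartSLOduration
	fmt.Println("E2E:", e2e, "SLO (pull time excluded):", slo)
}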
Need to start a new one" pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:07:21.095869 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.095798 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77c4645b68-cvtsr_43428ca4-798e-4aea-98e6-809f01256074/console/0.log" Apr 20 15:07:21.095869 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.095842 2577 generic.go:358] "Generic (PLEG): container finished" podID="43428ca4-798e-4aea-98e6-809f01256074" containerID="088697d48783364ad995457fd55818e174594f4288bffd71b3ef040d0f58356f" exitCode=2 Apr 20 15:07:21.096144 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.095900 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77c4645b68-cvtsr" event={"ID":"43428ca4-798e-4aea-98e6-809f01256074","Type":"ContainerDied","Data":"088697d48783364ad995457fd55818e174594f4288bffd71b3ef040d0f58356f"} Apr 20 15:07:21.096144 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.095933 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77c4645b68-cvtsr" event={"ID":"43428ca4-798e-4aea-98e6-809f01256074","Type":"ContainerDied","Data":"e8bf10e5691eff146102c122d164f05029abb3ad58303bb05c3d895c023e072b"} Apr 20 15:07:21.096144 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.095930 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77c4645b68-cvtsr" Apr 20 15:07:21.096144 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.095948 2577 scope.go:117] "RemoveContainer" containerID="088697d48783364ad995457fd55818e174594f4288bffd71b3ef040d0f58356f" Apr 20 15:07:21.103931 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.103907 2577 scope.go:117] "RemoveContainer" containerID="088697d48783364ad995457fd55818e174594f4288bffd71b3ef040d0f58356f" Apr 20 15:07:21.104257 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:07:21.104226 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"088697d48783364ad995457fd55818e174594f4288bffd71b3ef040d0f58356f\": container with ID starting with 088697d48783364ad995457fd55818e174594f4288bffd71b3ef040d0f58356f not found: ID does not exist" containerID="088697d48783364ad995457fd55818e174594f4288bffd71b3ef040d0f58356f" Apr 20 15:07:21.104318 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.104274 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"088697d48783364ad995457fd55818e174594f4288bffd71b3ef040d0f58356f"} err="failed to get container status \"088697d48783364ad995457fd55818e174594f4288bffd71b3ef040d0f58356f\": rpc error: code = NotFound desc = could not find container \"088697d48783364ad995457fd55818e174594f4288bffd71b3ef040d0f58356f\": container with ID starting with 088697d48783364ad995457fd55818e174594f4288bffd71b3ef040d0f58356f not found: ID does not exist" Apr 20 15:07:21.174853 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.174815 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4wg2\" (UniqueName: \"kubernetes.io/projected/43428ca4-798e-4aea-98e6-809f01256074-kube-api-access-z4wg2\") pod \"43428ca4-798e-4aea-98e6-809f01256074\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " Apr 20 15:07:21.174853 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.174863 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-oauth-serving-cert\") pod \"43428ca4-798e-4aea-98e6-809f01256074\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " Apr 20 15:07:21.175104 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.174896 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-trusted-ca-bundle\") pod \"43428ca4-798e-4aea-98e6-809f01256074\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " Apr 20 15:07:21.175104 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.174921 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-console-config\") pod \"43428ca4-798e-4aea-98e6-809f01256074\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " Apr 20 15:07:21.175104 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.175015 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-service-ca\") pod \"43428ca4-798e-4aea-98e6-809f01256074\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " Apr 20 15:07:21.175104 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.175080 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43428ca4-798e-4aea-98e6-809f01256074-console-oauth-config\") pod \"43428ca4-798e-4aea-98e6-809f01256074\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " Apr 20 15:07:21.175334 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.175115 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43428ca4-798e-4aea-98e6-809f01256074-console-serving-cert\") pod \"43428ca4-798e-4aea-98e6-809f01256074\" (UID: \"43428ca4-798e-4aea-98e6-809f01256074\") " Apr 20 15:07:21.175399 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.175363 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43428ca4-798e-4aea-98e6-809f01256074" (UID: "43428ca4-798e-4aea-98e6-809f01256074"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:07:21.175399 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.175375 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-console-config" (OuterVolumeSpecName: "console-config") pod "43428ca4-798e-4aea-98e6-809f01256074" (UID: "43428ca4-798e-4aea-98e6-809f01256074"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:07:21.175399 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.175385 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43428ca4-798e-4aea-98e6-809f01256074" (UID: "43428ca4-798e-4aea-98e6-809f01256074"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:07:21.175532 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.175400 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-service-ca" (OuterVolumeSpecName: "service-ca") pod "43428ca4-798e-4aea-98e6-809f01256074" (UID: "43428ca4-798e-4aea-98e6-809f01256074"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:07:21.177353 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.177325 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43428ca4-798e-4aea-98e6-809f01256074-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43428ca4-798e-4aea-98e6-809f01256074" (UID: "43428ca4-798e-4aea-98e6-809f01256074"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:07:21.177490 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.177370 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43428ca4-798e-4aea-98e6-809f01256074-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43428ca4-798e-4aea-98e6-809f01256074" (UID: "43428ca4-798e-4aea-98e6-809f01256074"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:07:21.177532 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.177508 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43428ca4-798e-4aea-98e6-809f01256074-kube-api-access-z4wg2" (OuterVolumeSpecName: "kube-api-access-z4wg2") pod "43428ca4-798e-4aea-98e6-809f01256074" (UID: "43428ca4-798e-4aea-98e6-809f01256074"). InnerVolumeSpecName "kube-api-access-z4wg2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:07:21.276455 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.276406 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43428ca4-798e-4aea-98e6-809f01256074-console-serving-cert\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:07:21.276455 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.276451 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z4wg2\" (UniqueName: \"kubernetes.io/projected/43428ca4-798e-4aea-98e6-809f01256074-kube-api-access-z4wg2\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:07:21.276455 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.276461 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-oauth-serving-cert\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:07:21.276455 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.276472 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-trusted-ca-bundle\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:07:21.276735 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.276481 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-console-config\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:07:21.276735 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.276490 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43428ca4-798e-4aea-98e6-809f01256074-service-ca\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:07:21.276735 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.276499 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43428ca4-798e-4aea-98e6-809f01256074-console-oauth-config\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:07:21.417466 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.417433 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77c4645b68-cvtsr"] Apr 20 15:07:21.420966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:21.420937 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-77c4645b68-cvtsr"] Apr 20 15:07:22.195037 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:22.194997 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43428ca4-798e-4aea-98e6-809f01256074" path="/var/lib/kubelet/pods/43428ca4-798e-4aea-98e6-809f01256074/volumes" Apr 20 15:07:26.082690 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:26.082643 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rwbpz" Apr 20 15:07:26.099898 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:26.099867 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:07:26.104033 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:26.104005 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:07:26.163894 
ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:26.163856 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86cdf84568-mc2rg"] Apr 20 15:07:30.085230 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:30.085201 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-acl-logging/0.log" Apr 20 15:07:30.085743 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:30.085726 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-acl-logging/0.log" Apr 20 15:07:30.093968 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:30.093942 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 15:07:33.651589 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:33.651538 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" podUID="36c8144f-abb6-49c5-b926-7737d427688b" containerName="registry" containerID="cri-o://fcdd75244d0ae375d2bff5106b465c4e4c77d977076de1c899aa9ee0eb449881" gracePeriod=30 Apr 20 15:07:34.770175 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:34.770153 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 15:07:35.136196 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:35.136113 2577 generic.go:358] "Generic (PLEG): container finished" podID="36c8144f-abb6-49c5-b926-7737d427688b" containerID="fcdd75244d0ae375d2bff5106b465c4e4c77d977076de1c899aa9ee0eb449881" exitCode=0 Apr 20 15:07:35.136196 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:35.136164 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" event={"ID":"36c8144f-abb6-49c5-b926-7737d427688b","Type":"ContainerDied","Data":"fcdd75244d0ae375d2bff5106b465c4e4c77d977076de1c899aa9ee0eb449881"} Apr 20 15:07:35.136196 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:35.136195 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" event={"ID":"36c8144f-abb6-49c5-b926-7737d427688b","Type":"ContainerStarted","Data":"d3ded8a403bbc4e670d38d22d0d1a7c23cf062fb5ffbb6752affe8447a7e7470"} Apr 20 15:07:35.136480 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:35.136216 2577 scope.go:117] "RemoveContainer" containerID="27dcf47d70cf2ba5f3fabf982c3b9113c224f606394619aad246f890c072b325" Apr 20 15:07:35.136480 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:35.136307 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:07:51.184928 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.184883 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-86cdf84568-mc2rg" podUID="17be87f1-b077-409b-803a-3dfe473a0aa7" containerName="console" containerID="cri-o://fdddb2f5eb90fbee780b30307019e9e55aab354ab43330775c21533579a75ad8" gracePeriod=15 Apr 20 15:07:51.433597 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.433569 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86cdf84568-mc2rg_17be87f1-b077-409b-803a-3dfe473a0aa7/console/0.log" Apr 20 15:07:51.433733 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.433633 2577 util.go:48] "No ready sandbox for pod can be found. 
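The "Killing container with a grace period" entries mark the normal termination path: the runtime delivers the container's stop signal (SIGTERM by default), waits up to the logged gracePeriod (15s for the console pods, 30s for the registry), and only then escalates to SIGKILL. The registry exits 0 within its window, while the console containers in this log show exitCode=2 in their ContainerDied events. A minimal, hypothetical sketch of a workload that cooperates with that sequence, using a plain HTTP server as a stand-in:

package main

import (
	"context"
	"log"
	"net/http"
	"os"
	"os/signal"
	"syscall"
	"time"
)

func main() {
	// Stand-in health endpoint; port and path are illustrative only.
	http.HandleFunc("/health", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
	})
	srv := &http.Server{Addr: ":8443"}

	go func() {
		if err := srv.ListenAndServe(); err != nil && err != http.ErrServerClosed {
			log.Fatal(err)
		}
	}()

	// The runtime sends the stop signal first and waits for the grace period
	// (gracePeriod=15 / gracePeriod=30 in the entries above) before SIGKILL.
	stop := make(chan os.Signal, 1)
	signal.Notify(stop, syscall.SIGTERM, os.Interrupt)
	<-stop

	ctx, cancel := context.WithTimeout(context.Background(), 15*time.Second)
	defer cancel()
	if err := srv.Shutdown(ctx); err != nil {
		os.Exit(2) // an unclean shutdown surfaces as a non-zero exitCode in the PLEG events
	}
	os.Exit(0)
}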
Need to start a new one" pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:07:51.530338 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.530280 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-console-config\") pod \"17be87f1-b077-409b-803a-3dfe473a0aa7\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " Apr 20 15:07:51.530338 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.530343 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/17be87f1-b077-409b-803a-3dfe473a0aa7-console-oauth-config\") pod \"17be87f1-b077-409b-803a-3dfe473a0aa7\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " Apr 20 15:07:51.530589 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.530365 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/17be87f1-b077-409b-803a-3dfe473a0aa7-console-serving-cert\") pod \"17be87f1-b077-409b-803a-3dfe473a0aa7\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " Apr 20 15:07:51.530589 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.530408 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-trusted-ca-bundle\") pod \"17be87f1-b077-409b-803a-3dfe473a0aa7\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " Apr 20 15:07:51.530589 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.530429 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-service-ca\") pod \"17be87f1-b077-409b-803a-3dfe473a0aa7\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " Apr 20 15:07:51.530589 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.530460 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2pxh\" (UniqueName: \"kubernetes.io/projected/17be87f1-b077-409b-803a-3dfe473a0aa7-kube-api-access-k2pxh\") pod \"17be87f1-b077-409b-803a-3dfe473a0aa7\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " Apr 20 15:07:51.530589 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.530528 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-oauth-serving-cert\") pod \"17be87f1-b077-409b-803a-3dfe473a0aa7\" (UID: \"17be87f1-b077-409b-803a-3dfe473a0aa7\") " Apr 20 15:07:51.530958 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.530930 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "17be87f1-b077-409b-803a-3dfe473a0aa7" (UID: "17be87f1-b077-409b-803a-3dfe473a0aa7"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:07:51.530958 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.530947 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-console-config" (OuterVolumeSpecName: "console-config") pod "17be87f1-b077-409b-803a-3dfe473a0aa7" (UID: "17be87f1-b077-409b-803a-3dfe473a0aa7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:07:51.531068 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.530960 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-service-ca" (OuterVolumeSpecName: "service-ca") pod "17be87f1-b077-409b-803a-3dfe473a0aa7" (UID: "17be87f1-b077-409b-803a-3dfe473a0aa7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:07:51.531068 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.530975 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "17be87f1-b077-409b-803a-3dfe473a0aa7" (UID: "17be87f1-b077-409b-803a-3dfe473a0aa7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:07:51.532689 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.532660 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17be87f1-b077-409b-803a-3dfe473a0aa7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "17be87f1-b077-409b-803a-3dfe473a0aa7" (UID: "17be87f1-b077-409b-803a-3dfe473a0aa7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:07:51.532774 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.532675 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17be87f1-b077-409b-803a-3dfe473a0aa7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "17be87f1-b077-409b-803a-3dfe473a0aa7" (UID: "17be87f1-b077-409b-803a-3dfe473a0aa7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:07:51.532774 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.532735 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17be87f1-b077-409b-803a-3dfe473a0aa7-kube-api-access-k2pxh" (OuterVolumeSpecName: "kube-api-access-k2pxh") pod "17be87f1-b077-409b-803a-3dfe473a0aa7" (UID: "17be87f1-b077-409b-803a-3dfe473a0aa7"). InnerVolumeSpecName "kube-api-access-k2pxh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:07:51.631633 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.631590 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-trusted-ca-bundle\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:07:51.631633 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.631622 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-service-ca\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:07:51.631633 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.631633 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k2pxh\" (UniqueName: \"kubernetes.io/projected/17be87f1-b077-409b-803a-3dfe473a0aa7-kube-api-access-k2pxh\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:07:51.631633 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.631642 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-oauth-serving-cert\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:07:51.631902 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.631651 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/17be87f1-b077-409b-803a-3dfe473a0aa7-console-config\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:07:51.631902 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.631660 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/17be87f1-b077-409b-803a-3dfe473a0aa7-console-oauth-config\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:07:51.631902 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:51.631671 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/17be87f1-b077-409b-803a-3dfe473a0aa7-console-serving-cert\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:07:52.186706 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:52.186679 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86cdf84568-mc2rg_17be87f1-b077-409b-803a-3dfe473a0aa7/console/0.log" Apr 20 15:07:52.187100 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:52.186720 2577 generic.go:358] "Generic (PLEG): container finished" podID="17be87f1-b077-409b-803a-3dfe473a0aa7" containerID="fdddb2f5eb90fbee780b30307019e9e55aab354ab43330775c21533579a75ad8" exitCode=2 Apr 20 15:07:52.187100 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:52.186757 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86cdf84568-mc2rg" event={"ID":"17be87f1-b077-409b-803a-3dfe473a0aa7","Type":"ContainerDied","Data":"fdddb2f5eb90fbee780b30307019e9e55aab354ab43330775c21533579a75ad8"} Apr 20 15:07:52.187100 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:52.186780 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86cdf84568-mc2rg" event={"ID":"17be87f1-b077-409b-803a-3dfe473a0aa7","Type":"ContainerDied","Data":"53c7a8981a6455cfb3693209cbcf1acd63a457aad8899a7160011b9ee96570aa"} Apr 20 15:07:52.187100 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:52.186785 2577 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86cdf84568-mc2rg" Apr 20 15:07:52.187100 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:52.186802 2577 scope.go:117] "RemoveContainer" containerID="fdddb2f5eb90fbee780b30307019e9e55aab354ab43330775c21533579a75ad8" Apr 20 15:07:52.195710 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:52.195693 2577 scope.go:117] "RemoveContainer" containerID="fdddb2f5eb90fbee780b30307019e9e55aab354ab43330775c21533579a75ad8" Apr 20 15:07:52.195976 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:07:52.195948 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdddb2f5eb90fbee780b30307019e9e55aab354ab43330775c21533579a75ad8\": container with ID starting with fdddb2f5eb90fbee780b30307019e9e55aab354ab43330775c21533579a75ad8 not found: ID does not exist" containerID="fdddb2f5eb90fbee780b30307019e9e55aab354ab43330775c21533579a75ad8" Apr 20 15:07:52.196026 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:52.195987 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdddb2f5eb90fbee780b30307019e9e55aab354ab43330775c21533579a75ad8"} err="failed to get container status \"fdddb2f5eb90fbee780b30307019e9e55aab354ab43330775c21533579a75ad8\": rpc error: code = NotFound desc = could not find container \"fdddb2f5eb90fbee780b30307019e9e55aab354ab43330775c21533579a75ad8\": container with ID starting with fdddb2f5eb90fbee780b30307019e9e55aab354ab43330775c21533579a75ad8 not found: ID does not exist" Apr 20 15:07:52.207845 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:52.207815 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86cdf84568-mc2rg"] Apr 20 15:07:52.211630 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:52.211604 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-86cdf84568-mc2rg"] Apr 20 15:07:54.195148 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:54.195102 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17be87f1-b077-409b-803a-3dfe473a0aa7" path="/var/lib/kubelet/pods/17be87f1-b077-409b-803a-3dfe473a0aa7/volumes" Apr 20 15:07:56.145552 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:07:56.145523 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-b959fb7fc-fmqvl" Apr 20 15:08:32.811770 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:32.811731 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-2vwvt"] Apr 20 15:08:32.812206 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:32.812014 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17be87f1-b077-409b-803a-3dfe473a0aa7" containerName="console" Apr 20 15:08:32.812206 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:32.812024 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="17be87f1-b077-409b-803a-3dfe473a0aa7" containerName="console" Apr 20 15:08:32.812206 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:32.812038 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43428ca4-798e-4aea-98e6-809f01256074" containerName="console" Apr 20 15:08:32.812206 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:32.812044 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="43428ca4-798e-4aea-98e6-809f01256074" containerName="console" Apr 20 15:08:32.812206 
ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:32.812096 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="17be87f1-b077-409b-803a-3dfe473a0aa7" containerName="console" Apr 20 15:08:32.812206 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:32.812104 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="43428ca4-798e-4aea-98e6-809f01256074" containerName="console" Apr 20 15:08:32.814632 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:32.814615 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2vwvt" Apr 20 15:08:32.816854 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:32.816834 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 15:08:32.822363 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:32.822334 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2vwvt"] Apr 20 15:08:32.963838 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:32.963800 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/eaf0e44b-a5e1-426a-8cc3-36a330c48389-original-pull-secret\") pod \"global-pull-secret-syncer-2vwvt\" (UID: \"eaf0e44b-a5e1-426a-8cc3-36a330c48389\") " pod="kube-system/global-pull-secret-syncer-2vwvt" Apr 20 15:08:32.964016 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:32.963982 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/eaf0e44b-a5e1-426a-8cc3-36a330c48389-kubelet-config\") pod \"global-pull-secret-syncer-2vwvt\" (UID: \"eaf0e44b-a5e1-426a-8cc3-36a330c48389\") " pod="kube-system/global-pull-secret-syncer-2vwvt" Apr 20 15:08:32.964066 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:32.964013 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/eaf0e44b-a5e1-426a-8cc3-36a330c48389-dbus\") pod \"global-pull-secret-syncer-2vwvt\" (UID: \"eaf0e44b-a5e1-426a-8cc3-36a330c48389\") " pod="kube-system/global-pull-secret-syncer-2vwvt" Apr 20 15:08:33.065351 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:33.065246 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/eaf0e44b-a5e1-426a-8cc3-36a330c48389-kubelet-config\") pod \"global-pull-secret-syncer-2vwvt\" (UID: \"eaf0e44b-a5e1-426a-8cc3-36a330c48389\") " pod="kube-system/global-pull-secret-syncer-2vwvt" Apr 20 15:08:33.065351 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:33.065298 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/eaf0e44b-a5e1-426a-8cc3-36a330c48389-dbus\") pod \"global-pull-secret-syncer-2vwvt\" (UID: \"eaf0e44b-a5e1-426a-8cc3-36a330c48389\") " pod="kube-system/global-pull-secret-syncer-2vwvt" Apr 20 15:08:33.065351 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:33.065331 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/eaf0e44b-a5e1-426a-8cc3-36a330c48389-original-pull-secret\") pod \"global-pull-secret-syncer-2vwvt\" (UID: \"eaf0e44b-a5e1-426a-8cc3-36a330c48389\") " pod="kube-system/global-pull-secret-syncer-2vwvt" Apr 20 
15:08:33.065564 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:33.065362 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/eaf0e44b-a5e1-426a-8cc3-36a330c48389-kubelet-config\") pod \"global-pull-secret-syncer-2vwvt\" (UID: \"eaf0e44b-a5e1-426a-8cc3-36a330c48389\") " pod="kube-system/global-pull-secret-syncer-2vwvt" Apr 20 15:08:33.065564 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:33.065481 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/eaf0e44b-a5e1-426a-8cc3-36a330c48389-dbus\") pod \"global-pull-secret-syncer-2vwvt\" (UID: \"eaf0e44b-a5e1-426a-8cc3-36a330c48389\") " pod="kube-system/global-pull-secret-syncer-2vwvt" Apr 20 15:08:33.067628 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:33.067609 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/eaf0e44b-a5e1-426a-8cc3-36a330c48389-original-pull-secret\") pod \"global-pull-secret-syncer-2vwvt\" (UID: \"eaf0e44b-a5e1-426a-8cc3-36a330c48389\") " pod="kube-system/global-pull-secret-syncer-2vwvt" Apr 20 15:08:33.123840 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:33.123801 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2vwvt" Apr 20 15:08:33.242906 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:33.242872 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2vwvt"] Apr 20 15:08:33.246411 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:08:33.246380 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaf0e44b_a5e1_426a_8cc3_36a330c48389.slice/crio-6a5bd5f8139b305319eff1d8ecc5aa43b34014b0602bd50c9c80620776c59e10 WatchSource:0}: Error finding container 6a5bd5f8139b305319eff1d8ecc5aa43b34014b0602bd50c9c80620776c59e10: Status 404 returned error can't find the container with id 6a5bd5f8139b305319eff1d8ecc5aa43b34014b0602bd50c9c80620776c59e10 Apr 20 15:08:33.304222 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:33.304184 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2vwvt" event={"ID":"eaf0e44b-a5e1-426a-8cc3-36a330c48389","Type":"ContainerStarted","Data":"6a5bd5f8139b305319eff1d8ecc5aa43b34014b0602bd50c9c80620776c59e10"} Apr 20 15:08:39.323602 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:39.323567 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2vwvt" event={"ID":"eaf0e44b-a5e1-426a-8cc3-36a330c48389","Type":"ContainerStarted","Data":"a936a8a5aca40d56fa3f2f05e78a0a9151bb4e3b0a64231b91b423925344d485"} Apr 20 15:08:39.338694 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:08:39.338644 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-2vwvt" podStartSLOduration=2.119430682 podStartE2EDuration="7.338626455s" podCreationTimestamp="2026-04-20 15:08:32 +0000 UTC" firstStartedPulling="2026-04-20 15:08:33.247983435 +0000 UTC m=+363.683527567" lastFinishedPulling="2026-04-20 15:08:38.467179209 +0000 UTC m=+368.902723340" observedRunningTime="2026-04-20 15:08:39.337162274 +0000 UTC m=+369.772706428" watchObservedRunningTime="2026-04-20 15:08:39.338626455 +0000 UTC m=+369.774170610" Apr 20 15:09:25.034827 ip-10-0-129-115 kubenswrapper[2577]: I0420 
15:09:25.034788 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-852gg"] Apr 20 15:09:25.038080 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:25.038061 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-852gg" Apr 20 15:09:25.042070 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:25.042046 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-6hxlq\"" Apr 20 15:09:25.042421 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:25.042400 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 20 15:09:25.042475 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:25.042427 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 20 15:09:25.051694 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:25.051665 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-852gg"] Apr 20 15:09:25.095651 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:25.095605 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5c84b1f3-06a5-4c01-894f-58f8086ada11-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-852gg\" (UID: \"5c84b1f3-06a5-4c01-894f-58f8086ada11\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-852gg" Apr 20 15:09:25.095651 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:25.095653 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r95q\" (UniqueName: \"kubernetes.io/projected/5c84b1f3-06a5-4c01-894f-58f8086ada11-kube-api-access-6r95q\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-852gg\" (UID: \"5c84b1f3-06a5-4c01-894f-58f8086ada11\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-852gg" Apr 20 15:09:25.196253 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:25.196222 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5c84b1f3-06a5-4c01-894f-58f8086ada11-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-852gg\" (UID: \"5c84b1f3-06a5-4c01-894f-58f8086ada11\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-852gg" Apr 20 15:09:25.196424 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:25.196259 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6r95q\" (UniqueName: \"kubernetes.io/projected/5c84b1f3-06a5-4c01-894f-58f8086ada11-kube-api-access-6r95q\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-852gg\" (UID: \"5c84b1f3-06a5-4c01-894f-58f8086ada11\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-852gg" Apr 20 15:09:25.196619 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:25.196598 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5c84b1f3-06a5-4c01-894f-58f8086ada11-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-852gg\" (UID: 
\"5c84b1f3-06a5-4c01-894f-58f8086ada11\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-852gg" Apr 20 15:09:25.207356 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:25.207319 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r95q\" (UniqueName: \"kubernetes.io/projected/5c84b1f3-06a5-4c01-894f-58f8086ada11-kube-api-access-6r95q\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-852gg\" (UID: \"5c84b1f3-06a5-4c01-894f-58f8086ada11\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-852gg" Apr 20 15:09:25.348057 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:25.347974 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-852gg" Apr 20 15:09:25.478165 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:25.478134 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-852gg"] Apr 20 15:09:25.481211 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:09:25.481171 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c84b1f3_06a5_4c01_894f_58f8086ada11.slice/crio-2e363dc8585dcdfcb5885f5e31de6eb360cb01321d6313ce083039251cecc5fd WatchSource:0}: Error finding container 2e363dc8585dcdfcb5885f5e31de6eb360cb01321d6313ce083039251cecc5fd: Status 404 returned error can't find the container with id 2e363dc8585dcdfcb5885f5e31de6eb360cb01321d6313ce083039251cecc5fd Apr 20 15:09:26.465137 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:26.465080 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-852gg" event={"ID":"5c84b1f3-06a5-4c01-894f-58f8086ada11","Type":"ContainerStarted","Data":"2e363dc8585dcdfcb5885f5e31de6eb360cb01321d6313ce083039251cecc5fd"} Apr 20 15:09:28.473656 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:28.473623 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-852gg" event={"ID":"5c84b1f3-06a5-4c01-894f-58f8086ada11","Type":"ContainerStarted","Data":"860c116e514e72691f9e97a9cecbdb30154421fa000ae7c9925b8babd46609c1"} Apr 20 15:09:28.495163 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:28.495106 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-852gg" podStartSLOduration=1.099904887 podStartE2EDuration="3.495090824s" podCreationTimestamp="2026-04-20 15:09:25 +0000 UTC" firstStartedPulling="2026-04-20 15:09:25.483713278 +0000 UTC m=+415.919257410" lastFinishedPulling="2026-04-20 15:09:27.8788992 +0000 UTC m=+418.314443347" observedRunningTime="2026-04-20 15:09:28.493245287 +0000 UTC m=+418.928789443" watchObservedRunningTime="2026-04-20 15:09:28.495090824 +0000 UTC m=+418.930634978" Apr 20 15:09:31.849603 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:31.849567 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-mdzfn"] Apr 20 15:09:31.853037 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:31.853012 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-mdzfn" Apr 20 15:09:31.855313 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:31.855268 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 20 15:09:31.856116 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:31.856095 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 20 15:09:31.856243 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:31.856168 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-6q87t\"" Apr 20 15:09:31.862515 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:31.862492 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-mdzfn"] Apr 20 15:09:31.957113 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:31.957079 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d70e5908-fb6b-439e-b23b-5eedd54431ac-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-mdzfn\" (UID: \"d70e5908-fb6b-439e-b23b-5eedd54431ac\") " pod="cert-manager/cert-manager-webhook-597b96b99b-mdzfn" Apr 20 15:09:31.957315 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:31.957150 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xfxb\" (UniqueName: \"kubernetes.io/projected/d70e5908-fb6b-439e-b23b-5eedd54431ac-kube-api-access-5xfxb\") pod \"cert-manager-webhook-597b96b99b-mdzfn\" (UID: \"d70e5908-fb6b-439e-b23b-5eedd54431ac\") " pod="cert-manager/cert-manager-webhook-597b96b99b-mdzfn" Apr 20 15:09:32.058048 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:32.058011 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xfxb\" (UniqueName: \"kubernetes.io/projected/d70e5908-fb6b-439e-b23b-5eedd54431ac-kube-api-access-5xfxb\") pod \"cert-manager-webhook-597b96b99b-mdzfn\" (UID: \"d70e5908-fb6b-439e-b23b-5eedd54431ac\") " pod="cert-manager/cert-manager-webhook-597b96b99b-mdzfn" Apr 20 15:09:32.058243 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:32.058111 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d70e5908-fb6b-439e-b23b-5eedd54431ac-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-mdzfn\" (UID: \"d70e5908-fb6b-439e-b23b-5eedd54431ac\") " pod="cert-manager/cert-manager-webhook-597b96b99b-mdzfn" Apr 20 15:09:32.065861 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:32.065830 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d70e5908-fb6b-439e-b23b-5eedd54431ac-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-mdzfn\" (UID: \"d70e5908-fb6b-439e-b23b-5eedd54431ac\") " pod="cert-manager/cert-manager-webhook-597b96b99b-mdzfn" Apr 20 15:09:32.066359 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:32.066341 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xfxb\" (UniqueName: \"kubernetes.io/projected/d70e5908-fb6b-439e-b23b-5eedd54431ac-kube-api-access-5xfxb\") pod \"cert-manager-webhook-597b96b99b-mdzfn\" (UID: \"d70e5908-fb6b-439e-b23b-5eedd54431ac\") " pod="cert-manager/cert-manager-webhook-597b96b99b-mdzfn" Apr 20 
15:09:32.174362 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:32.174239 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-mdzfn" Apr 20 15:09:32.296395 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:32.296357 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-mdzfn"] Apr 20 15:09:32.299388 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:09:32.299361 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd70e5908_fb6b_439e_b23b_5eedd54431ac.slice/crio-36a8081a0217c46c6aa4285e401ca4aed2e4fe3d426d969a91ebeb85a94d3729 WatchSource:0}: Error finding container 36a8081a0217c46c6aa4285e401ca4aed2e4fe3d426d969a91ebeb85a94d3729: Status 404 returned error can't find the container with id 36a8081a0217c46c6aa4285e401ca4aed2e4fe3d426d969a91ebeb85a94d3729 Apr 20 15:09:32.485960 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:32.485917 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-mdzfn" event={"ID":"d70e5908-fb6b-439e-b23b-5eedd54431ac","Type":"ContainerStarted","Data":"36a8081a0217c46c6aa4285e401ca4aed2e4fe3d426d969a91ebeb85a94d3729"} Apr 20 15:09:35.497325 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:35.497265 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-mdzfn" event={"ID":"d70e5908-fb6b-439e-b23b-5eedd54431ac","Type":"ContainerStarted","Data":"f65c53cf5fbfcdc1751f2a5c02a71dc4d108e7add3bde5575adc0ed1f5686a45"} Apr 20 15:09:35.497820 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:35.497341 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-mdzfn" Apr 20 15:09:35.513972 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:35.513907 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-mdzfn" podStartSLOduration=1.611160747 podStartE2EDuration="4.513886016s" podCreationTimestamp="2026-04-20 15:09:31 +0000 UTC" firstStartedPulling="2026-04-20 15:09:32.301822659 +0000 UTC m=+422.737366791" lastFinishedPulling="2026-04-20 15:09:35.204547928 +0000 UTC m=+425.640092060" observedRunningTime="2026-04-20 15:09:35.511831879 +0000 UTC m=+425.947376033" watchObservedRunningTime="2026-04-20 15:09:35.513886016 +0000 UTC m=+425.949430172" Apr 20 15:09:41.503103 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:41.503020 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-mdzfn" Apr 20 15:09:44.299002 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:44.298956 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-6x9qg"] Apr 20 15:09:44.301949 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:44.301926 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-6x9qg" Apr 20 15:09:44.303983 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:44.303960 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-fgc6g\"" Apr 20 15:09:44.310425 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:44.310401 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-6x9qg"] Apr 20 15:09:44.362274 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:44.362233 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzxwq\" (UniqueName: \"kubernetes.io/projected/63cf5dbc-556f-4666-9ab3-020e7e8f6eb2-kube-api-access-dzxwq\") pod \"cert-manager-759f64656b-6x9qg\" (UID: \"63cf5dbc-556f-4666-9ab3-020e7e8f6eb2\") " pod="cert-manager/cert-manager-759f64656b-6x9qg" Apr 20 15:09:44.362274 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:44.362280 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63cf5dbc-556f-4666-9ab3-020e7e8f6eb2-bound-sa-token\") pod \"cert-manager-759f64656b-6x9qg\" (UID: \"63cf5dbc-556f-4666-9ab3-020e7e8f6eb2\") " pod="cert-manager/cert-manager-759f64656b-6x9qg" Apr 20 15:09:44.463694 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:44.463643 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzxwq\" (UniqueName: \"kubernetes.io/projected/63cf5dbc-556f-4666-9ab3-020e7e8f6eb2-kube-api-access-dzxwq\") pod \"cert-manager-759f64656b-6x9qg\" (UID: \"63cf5dbc-556f-4666-9ab3-020e7e8f6eb2\") " pod="cert-manager/cert-manager-759f64656b-6x9qg" Apr 20 15:09:44.463874 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:44.463707 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63cf5dbc-556f-4666-9ab3-020e7e8f6eb2-bound-sa-token\") pod \"cert-manager-759f64656b-6x9qg\" (UID: \"63cf5dbc-556f-4666-9ab3-020e7e8f6eb2\") " pod="cert-manager/cert-manager-759f64656b-6x9qg" Apr 20 15:09:44.471903 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:44.471865 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63cf5dbc-556f-4666-9ab3-020e7e8f6eb2-bound-sa-token\") pod \"cert-manager-759f64656b-6x9qg\" (UID: \"63cf5dbc-556f-4666-9ab3-020e7e8f6eb2\") " pod="cert-manager/cert-manager-759f64656b-6x9qg" Apr 20 15:09:44.472035 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:44.471939 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzxwq\" (UniqueName: \"kubernetes.io/projected/63cf5dbc-556f-4666-9ab3-020e7e8f6eb2-kube-api-access-dzxwq\") pod \"cert-manager-759f64656b-6x9qg\" (UID: \"63cf5dbc-556f-4666-9ab3-020e7e8f6eb2\") " pod="cert-manager/cert-manager-759f64656b-6x9qg" Apr 20 15:09:44.611987 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:44.611879 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-6x9qg" Apr 20 15:09:44.733129 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:44.733090 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-6x9qg"] Apr 20 15:09:44.736274 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:09:44.736246 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63cf5dbc_556f_4666_9ab3_020e7e8f6eb2.slice/crio-eaf87c06df3be2868eaf5056725bb3a5b0f03681939e34d32d26aafd3777ae01 WatchSource:0}: Error finding container eaf87c06df3be2868eaf5056725bb3a5b0f03681939e34d32d26aafd3777ae01: Status 404 returned error can't find the container with id eaf87c06df3be2868eaf5056725bb3a5b0f03681939e34d32d26aafd3777ae01 Apr 20 15:09:45.530916 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:45.530882 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-6x9qg" event={"ID":"63cf5dbc-556f-4666-9ab3-020e7e8f6eb2","Type":"ContainerStarted","Data":"573530ad29fa38f90896227c9fcec9c9436398e51074d3cdcbd13c4f9e630c5b"} Apr 20 15:09:45.530916 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:45.530919 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-6x9qg" event={"ID":"63cf5dbc-556f-4666-9ab3-020e7e8f6eb2","Type":"ContainerStarted","Data":"eaf87c06df3be2868eaf5056725bb3a5b0f03681939e34d32d26aafd3777ae01"} Apr 20 15:09:45.545264 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:09:45.545064 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-6x9qg" podStartSLOduration=1.545046042 podStartE2EDuration="1.545046042s" podCreationTimestamp="2026-04-20 15:09:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:09:45.544508961 +0000 UTC m=+435.980053212" watchObservedRunningTime="2026-04-20 15:09:45.545046042 +0000 UTC m=+435.980590198" Apr 20 15:10:04.007949 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:04.007913 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-282sq"] Apr 20 15:10:04.013654 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:04.013630 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-282sq" Apr 20 15:10:04.016474 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:04.016447 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 15:10:04.016608 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:04.016449 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 15:10:04.016666 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:04.016608 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 15:10:04.016666 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:04.016606 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 15:10:04.016840 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:04.016812 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-td6pf\"" Apr 20 15:10:04.033606 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:04.033578 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-282sq"] Apr 20 15:10:04.107759 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:04.107719 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f586be21-da87-4be6-a898-fc8c56047904-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-282sq\" (UID: \"f586be21-da87-4be6-a898-fc8c56047904\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-282sq" Apr 20 15:10:04.107759 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:04.107769 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f586be21-da87-4be6-a898-fc8c56047904-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-282sq\" (UID: \"f586be21-da87-4be6-a898-fc8c56047904\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-282sq" Apr 20 15:10:04.107989 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:04.107853 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gc4x\" (UniqueName: \"kubernetes.io/projected/f586be21-da87-4be6-a898-fc8c56047904-kube-api-access-5gc4x\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-282sq\" (UID: \"f586be21-da87-4be6-a898-fc8c56047904\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-282sq" Apr 20 15:10:04.208315 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:04.208245 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gc4x\" (UniqueName: \"kubernetes.io/projected/f586be21-da87-4be6-a898-fc8c56047904-kube-api-access-5gc4x\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-282sq\" (UID: \"f586be21-da87-4be6-a898-fc8c56047904\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-282sq" Apr 20 15:10:04.208505 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:04.208335 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/f586be21-da87-4be6-a898-fc8c56047904-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-282sq\" (UID: \"f586be21-da87-4be6-a898-fc8c56047904\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-282sq" Apr 20 15:10:04.208505 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:04.208366 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f586be21-da87-4be6-a898-fc8c56047904-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-282sq\" (UID: \"f586be21-da87-4be6-a898-fc8c56047904\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-282sq" Apr 20 15:10:04.210849 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:04.210813 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f586be21-da87-4be6-a898-fc8c56047904-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-282sq\" (UID: \"f586be21-da87-4be6-a898-fc8c56047904\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-282sq" Apr 20 15:10:04.210990 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:04.210899 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f586be21-da87-4be6-a898-fc8c56047904-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-282sq\" (UID: \"f586be21-da87-4be6-a898-fc8c56047904\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-282sq" Apr 20 15:10:04.217032 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:04.216992 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gc4x\" (UniqueName: \"kubernetes.io/projected/f586be21-da87-4be6-a898-fc8c56047904-kube-api-access-5gc4x\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-282sq\" (UID: \"f586be21-da87-4be6-a898-fc8c56047904\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-282sq" Apr 20 15:10:04.324917 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:04.324817 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-282sq" Apr 20 15:10:04.460521 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:04.460493 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-282sq"] Apr 20 15:10:04.463071 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:10:04.463045 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf586be21_da87_4be6_a898_fc8c56047904.slice/crio-57067a017042391005d07a9e08b907add10250794006259bedac7ec2e037243d WatchSource:0}: Error finding container 57067a017042391005d07a9e08b907add10250794006259bedac7ec2e037243d: Status 404 returned error can't find the container with id 57067a017042391005d07a9e08b907add10250794006259bedac7ec2e037243d Apr 20 15:10:04.600934 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:04.600842 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-282sq" event={"ID":"f586be21-da87-4be6-a898-fc8c56047904","Type":"ContainerStarted","Data":"57067a017042391005d07a9e08b907add10250794006259bedac7ec2e037243d"} Apr 20 15:10:07.612835 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:07.612788 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-282sq" event={"ID":"f586be21-da87-4be6-a898-fc8c56047904","Type":"ContainerStarted","Data":"5f9a29cf11ee798a77769ba48f39f697b6b0a468607a21287653ddd7e082242e"} Apr 20 15:10:07.613308 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:07.612947 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-282sq" Apr 20 15:10:07.638573 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:07.638517 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-282sq" podStartSLOduration=2.023443368 podStartE2EDuration="4.638501919s" podCreationTimestamp="2026-04-20 15:10:03 +0000 UTC" firstStartedPulling="2026-04-20 15:10:04.464944419 +0000 UTC m=+454.900488552" lastFinishedPulling="2026-04-20 15:10:07.080002968 +0000 UTC m=+457.515547103" observedRunningTime="2026-04-20 15:10:07.636881615 +0000 UTC m=+458.072425769" watchObservedRunningTime="2026-04-20 15:10:07.638501919 +0000 UTC m=+458.074046073" Apr 20 15:10:08.759843 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:08.759803 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6b8584f779-646fk"] Apr 20 15:10:08.763166 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:08.763146 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6b8584f779-646fk" Apr 20 15:10:08.771562 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:08.771528 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 15:10:08.771700 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:08.771596 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 15:10:08.771700 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:08.771674 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 15:10:08.772514 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:08.772465 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 15:10:08.772514 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:08.772488 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-gxh59\"" Apr 20 15:10:08.772514 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:08.772506 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 15:10:08.774221 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:08.774187 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6b8584f779-646fk"] Apr 20 15:10:08.846189 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:08.846147 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e68b109d-9188-4f00-965a-775902235d56-cert\") pod \"lws-controller-manager-6b8584f779-646fk\" (UID: \"e68b109d-9188-4f00-965a-775902235d56\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-646fk" Apr 20 15:10:08.846189 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:08.846192 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e68b109d-9188-4f00-965a-775902235d56-metrics-cert\") pod \"lws-controller-manager-6b8584f779-646fk\" (UID: \"e68b109d-9188-4f00-965a-775902235d56\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-646fk" Apr 20 15:10:08.846454 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:08.846274 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e68b109d-9188-4f00-965a-775902235d56-manager-config\") pod \"lws-controller-manager-6b8584f779-646fk\" (UID: \"e68b109d-9188-4f00-965a-775902235d56\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-646fk" Apr 20 15:10:08.846454 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:08.846389 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wspsp\" (UniqueName: \"kubernetes.io/projected/e68b109d-9188-4f00-965a-775902235d56-kube-api-access-wspsp\") pod \"lws-controller-manager-6b8584f779-646fk\" (UID: \"e68b109d-9188-4f00-965a-775902235d56\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-646fk" Apr 20 15:10:08.947547 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:08.947495 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e68b109d-9188-4f00-965a-775902235d56-cert\") pod \"lws-controller-manager-6b8584f779-646fk\" (UID: \"e68b109d-9188-4f00-965a-775902235d56\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-646fk" Apr 20 15:10:08.947547 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:08.947546 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e68b109d-9188-4f00-965a-775902235d56-metrics-cert\") pod \"lws-controller-manager-6b8584f779-646fk\" (UID: \"e68b109d-9188-4f00-965a-775902235d56\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-646fk" Apr 20 15:10:08.947820 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:08.947583 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e68b109d-9188-4f00-965a-775902235d56-manager-config\") pod \"lws-controller-manager-6b8584f779-646fk\" (UID: \"e68b109d-9188-4f00-965a-775902235d56\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-646fk" Apr 20 15:10:08.947820 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:08.947642 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wspsp\" (UniqueName: \"kubernetes.io/projected/e68b109d-9188-4f00-965a-775902235d56-kube-api-access-wspsp\") pod \"lws-controller-manager-6b8584f779-646fk\" (UID: \"e68b109d-9188-4f00-965a-775902235d56\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-646fk" Apr 20 15:10:08.948243 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:08.948220 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e68b109d-9188-4f00-965a-775902235d56-manager-config\") pod \"lws-controller-manager-6b8584f779-646fk\" (UID: \"e68b109d-9188-4f00-965a-775902235d56\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-646fk" Apr 20 15:10:08.950155 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:08.950131 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e68b109d-9188-4f00-965a-775902235d56-metrics-cert\") pod \"lws-controller-manager-6b8584f779-646fk\" (UID: \"e68b109d-9188-4f00-965a-775902235d56\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-646fk" Apr 20 15:10:08.950238 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:08.950174 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e68b109d-9188-4f00-965a-775902235d56-cert\") pod \"lws-controller-manager-6b8584f779-646fk\" (UID: \"e68b109d-9188-4f00-965a-775902235d56\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-646fk" Apr 20 15:10:08.956079 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:08.956049 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wspsp\" (UniqueName: \"kubernetes.io/projected/e68b109d-9188-4f00-965a-775902235d56-kube-api-access-wspsp\") pod \"lws-controller-manager-6b8584f779-646fk\" (UID: \"e68b109d-9188-4f00-965a-775902235d56\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-646fk" Apr 20 15:10:09.072542 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:09.072438 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6b8584f779-646fk" Apr 20 15:10:09.201720 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:09.201692 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6b8584f779-646fk"] Apr 20 15:10:09.204381 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:10:09.204348 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode68b109d_9188_4f00_965a_775902235d56.slice/crio-f9d0f42bee65b27ae645b46deb95f8784c5882ace5df65feacc240967a6f8edc WatchSource:0}: Error finding container f9d0f42bee65b27ae645b46deb95f8784c5882ace5df65feacc240967a6f8edc: Status 404 returned error can't find the container with id f9d0f42bee65b27ae645b46deb95f8784c5882ace5df65feacc240967a6f8edc Apr 20 15:10:09.621351 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:09.621305 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6b8584f779-646fk" event={"ID":"e68b109d-9188-4f00-965a-775902235d56","Type":"ContainerStarted","Data":"f9d0f42bee65b27ae645b46deb95f8784c5882ace5df65feacc240967a6f8edc"} Apr 20 15:10:11.631189 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:11.631144 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6b8584f779-646fk" event={"ID":"e68b109d-9188-4f00-965a-775902235d56","Type":"ContainerStarted","Data":"e6d0c30b174b306a3b33644148ddda6d8f45fa2060bcf07429d9607f91278555"} Apr 20 15:10:11.631714 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:11.631208 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6b8584f779-646fk" Apr 20 15:10:11.648391 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:11.648338 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-6b8584f779-646fk" podStartSLOduration=1.3158015029999999 podStartE2EDuration="3.648320337s" podCreationTimestamp="2026-04-20 15:10:08 +0000 UTC" firstStartedPulling="2026-04-20 15:10:09.206057589 +0000 UTC m=+459.641601720" lastFinishedPulling="2026-04-20 15:10:11.538576422 +0000 UTC m=+461.974120554" observedRunningTime="2026-04-20 15:10:11.646512254 +0000 UTC m=+462.082056407" watchObservedRunningTime="2026-04-20 15:10:11.648320337 +0000 UTC m=+462.083864491" Apr 20 15:10:18.619145 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:18.619112 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-282sq" Apr 20 15:10:22.637939 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:22.637905 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6b8584f779-646fk" Apr 20 15:10:49.105187 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.105147 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-79975f5f76-splr8"] Apr 20 15:10:49.108705 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.108682 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.117905 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.117875 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79975f5f76-splr8"] Apr 20 15:10:49.201576 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.201536 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/017c3a70-c30c-4c24-a18f-60944ab58b29-console-serving-cert\") pod \"console-79975f5f76-splr8\" (UID: \"017c3a70-c30c-4c24-a18f-60944ab58b29\") " pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.201576 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.201573 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/017c3a70-c30c-4c24-a18f-60944ab58b29-console-oauth-config\") pod \"console-79975f5f76-splr8\" (UID: \"017c3a70-c30c-4c24-a18f-60944ab58b29\") " pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.201803 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.201600 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/017c3a70-c30c-4c24-a18f-60944ab58b29-oauth-serving-cert\") pod \"console-79975f5f76-splr8\" (UID: \"017c3a70-c30c-4c24-a18f-60944ab58b29\") " pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.201803 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.201691 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/017c3a70-c30c-4c24-a18f-60944ab58b29-console-config\") pod \"console-79975f5f76-splr8\" (UID: \"017c3a70-c30c-4c24-a18f-60944ab58b29\") " pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.201803 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.201729 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/017c3a70-c30c-4c24-a18f-60944ab58b29-service-ca\") pod \"console-79975f5f76-splr8\" (UID: \"017c3a70-c30c-4c24-a18f-60944ab58b29\") " pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.201803 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.201745 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/017c3a70-c30c-4c24-a18f-60944ab58b29-trusted-ca-bundle\") pod \"console-79975f5f76-splr8\" (UID: \"017c3a70-c30c-4c24-a18f-60944ab58b29\") " pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.201803 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.201762 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgxdd\" (UniqueName: \"kubernetes.io/projected/017c3a70-c30c-4c24-a18f-60944ab58b29-kube-api-access-zgxdd\") pod \"console-79975f5f76-splr8\" (UID: \"017c3a70-c30c-4c24-a18f-60944ab58b29\") " pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.302797 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.302753 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/017c3a70-c30c-4c24-a18f-60944ab58b29-oauth-serving-cert\") pod \"console-79975f5f76-splr8\" (UID: \"017c3a70-c30c-4c24-a18f-60944ab58b29\") " pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.302980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.302823 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/017c3a70-c30c-4c24-a18f-60944ab58b29-console-config\") pod \"console-79975f5f76-splr8\" (UID: \"017c3a70-c30c-4c24-a18f-60944ab58b29\") " pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.302980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.302849 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/017c3a70-c30c-4c24-a18f-60944ab58b29-service-ca\") pod \"console-79975f5f76-splr8\" (UID: \"017c3a70-c30c-4c24-a18f-60944ab58b29\") " pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.302980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.302865 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/017c3a70-c30c-4c24-a18f-60944ab58b29-trusted-ca-bundle\") pod \"console-79975f5f76-splr8\" (UID: \"017c3a70-c30c-4c24-a18f-60944ab58b29\") " pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.302980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.302879 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zgxdd\" (UniqueName: \"kubernetes.io/projected/017c3a70-c30c-4c24-a18f-60944ab58b29-kube-api-access-zgxdd\") pod \"console-79975f5f76-splr8\" (UID: \"017c3a70-c30c-4c24-a18f-60944ab58b29\") " pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.302980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.302927 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/017c3a70-c30c-4c24-a18f-60944ab58b29-console-serving-cert\") pod \"console-79975f5f76-splr8\" (UID: \"017c3a70-c30c-4c24-a18f-60944ab58b29\") " pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.302980 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.302950 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/017c3a70-c30c-4c24-a18f-60944ab58b29-console-oauth-config\") pod \"console-79975f5f76-splr8\" (UID: \"017c3a70-c30c-4c24-a18f-60944ab58b29\") " pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.303719 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.303683 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/017c3a70-c30c-4c24-a18f-60944ab58b29-service-ca\") pod \"console-79975f5f76-splr8\" (UID: \"017c3a70-c30c-4c24-a18f-60944ab58b29\") " pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.303719 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.303683 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/017c3a70-c30c-4c24-a18f-60944ab58b29-console-config\") pod \"console-79975f5f76-splr8\" (UID: \"017c3a70-c30c-4c24-a18f-60944ab58b29\") " pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.303894 ip-10-0-129-115 kubenswrapper[2577]: I0420 
15:10:49.303755 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/017c3a70-c30c-4c24-a18f-60944ab58b29-oauth-serving-cert\") pod \"console-79975f5f76-splr8\" (UID: \"017c3a70-c30c-4c24-a18f-60944ab58b29\") " pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.303894 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.303797 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/017c3a70-c30c-4c24-a18f-60944ab58b29-trusted-ca-bundle\") pod \"console-79975f5f76-splr8\" (UID: \"017c3a70-c30c-4c24-a18f-60944ab58b29\") " pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.305502 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.305474 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/017c3a70-c30c-4c24-a18f-60944ab58b29-console-serving-cert\") pod \"console-79975f5f76-splr8\" (UID: \"017c3a70-c30c-4c24-a18f-60944ab58b29\") " pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.305605 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.305510 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/017c3a70-c30c-4c24-a18f-60944ab58b29-console-oauth-config\") pod \"console-79975f5f76-splr8\" (UID: \"017c3a70-c30c-4c24-a18f-60944ab58b29\") " pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.311029 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.311004 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgxdd\" (UniqueName: \"kubernetes.io/projected/017c3a70-c30c-4c24-a18f-60944ab58b29-kube-api-access-zgxdd\") pod \"console-79975f5f76-splr8\" (UID: \"017c3a70-c30c-4c24-a18f-60944ab58b29\") " pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.420238 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.420141 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:49.546882 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.546855 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79975f5f76-splr8"] Apr 20 15:10:49.548968 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:10:49.548942 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod017c3a70_c30c_4c24_a18f_60944ab58b29.slice/crio-4d5fd38e928b72a24fb1a355fe94bc57ae6d03626a021117954779e86c22c998 WatchSource:0}: Error finding container 4d5fd38e928b72a24fb1a355fe94bc57ae6d03626a021117954779e86c22c998: Status 404 returned error can't find the container with id 4d5fd38e928b72a24fb1a355fe94bc57ae6d03626a021117954779e86c22c998 Apr 20 15:10:49.760311 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.760245 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79975f5f76-splr8" event={"ID":"017c3a70-c30c-4c24-a18f-60944ab58b29","Type":"ContainerStarted","Data":"afa55774ff7dd6d2c09002e12216034f9091f7035c7f7092eb4d3996632482e8"} Apr 20 15:10:49.760311 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.760296 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79975f5f76-splr8" event={"ID":"017c3a70-c30c-4c24-a18f-60944ab58b29","Type":"ContainerStarted","Data":"4d5fd38e928b72a24fb1a355fe94bc57ae6d03626a021117954779e86c22c998"} Apr 20 15:10:49.777025 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:49.776971 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-79975f5f76-splr8" podStartSLOduration=0.776953982 podStartE2EDuration="776.953982ms" podCreationTimestamp="2026-04-20 15:10:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:10:49.775504146 +0000 UTC m=+500.211048299" watchObservedRunningTime="2026-04-20 15:10:49.776953982 +0000 UTC m=+500.212498135" Apr 20 15:10:59.420683 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:59.420643 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:59.420683 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:59.420693 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:59.425531 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:59.425500 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:59.801531 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:59.801498 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-79975f5f76-splr8" Apr 20 15:10:59.844147 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:10:59.844108 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d9bf87884-scbxr"] Apr 20 15:11:22.321449 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:22.321360 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-wh4hx"] Apr 20 15:11:22.324739 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:22.324720 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-wh4hx" Apr 20 15:11:22.326942 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:22.326912 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 15:11:22.327052 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:22.326919 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 15:11:22.327680 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:22.327661 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-7vb42\"" Apr 20 15:11:22.331570 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:22.331544 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-wh4hx"] Apr 20 15:11:22.378212 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:22.378165 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xq7v\" (UniqueName: \"kubernetes.io/projected/34a47557-4f99-476a-b736-08f8ffe4db64-kube-api-access-4xq7v\") pod \"kuadrant-operator-catalog-wh4hx\" (UID: \"34a47557-4f99-476a-b736-08f8ffe4db64\") " pod="kuadrant-system/kuadrant-operator-catalog-wh4hx" Apr 20 15:11:22.478992 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:22.478942 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xq7v\" (UniqueName: \"kubernetes.io/projected/34a47557-4f99-476a-b736-08f8ffe4db64-kube-api-access-4xq7v\") pod \"kuadrant-operator-catalog-wh4hx\" (UID: \"34a47557-4f99-476a-b736-08f8ffe4db64\") " pod="kuadrant-system/kuadrant-operator-catalog-wh4hx" Apr 20 15:11:22.486869 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:22.486831 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xq7v\" (UniqueName: \"kubernetes.io/projected/34a47557-4f99-476a-b736-08f8ffe4db64-kube-api-access-4xq7v\") pod \"kuadrant-operator-catalog-wh4hx\" (UID: \"34a47557-4f99-476a-b736-08f8ffe4db64\") " pod="kuadrant-system/kuadrant-operator-catalog-wh4hx" Apr 20 15:11:22.635858 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:22.635746 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-wh4hx" Apr 20 15:11:22.762441 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:22.762405 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-wh4hx"] Apr 20 15:11:22.765753 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:11:22.765725 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34a47557_4f99_476a_b736_08f8ffe4db64.slice/crio-77674aff9f0705ca0a9ad99c5b293c977a850a2a168580dc586c7b6ad6d0d059 WatchSource:0}: Error finding container 77674aff9f0705ca0a9ad99c5b293c977a850a2a168580dc586c7b6ad6d0d059: Status 404 returned error can't find the container with id 77674aff9f0705ca0a9ad99c5b293c977a850a2a168580dc586c7b6ad6d0d059 Apr 20 15:11:22.876419 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:22.876381 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-wh4hx" event={"ID":"34a47557-4f99-476a-b736-08f8ffe4db64","Type":"ContainerStarted","Data":"77674aff9f0705ca0a9ad99c5b293c977a850a2a168580dc586c7b6ad6d0d059"} Apr 20 15:11:24.866955 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:24.866911 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5d9bf87884-scbxr" podUID="6789ba89-9e79-4d97-aa2d-d3ad018fe5fa" containerName="console" containerID="cri-o://8b67b9075a34cedadf35b925434c704c5a43e05367085fa2227ccbbe92478bf9" gracePeriod=15 Apr 20 15:11:25.122860 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.122836 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d9bf87884-scbxr_6789ba89-9e79-4d97-aa2d-d3ad018fe5fa/console/0.log" Apr 20 15:11:25.122997 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.122899 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:11:25.202904 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.202871 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48b8v\" (UniqueName: \"kubernetes.io/projected/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-kube-api-access-48b8v\") pod \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " Apr 20 15:11:25.203086 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.202933 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-oauth-serving-cert\") pod \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " Apr 20 15:11:25.203086 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.202965 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-console-oauth-config\") pod \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " Apr 20 15:11:25.203086 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.202988 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-console-serving-cert\") pod \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " Apr 20 15:11:25.203086 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.203010 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-trusted-ca-bundle\") pod \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " Apr 20 15:11:25.203086 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.203041 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-console-config\") pod \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " Apr 20 15:11:25.203086 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.203065 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-service-ca\") pod \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\" (UID: \"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa\") " Apr 20 15:11:25.203585 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.203555 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6789ba89-9e79-4d97-aa2d-d3ad018fe5fa" (UID: "6789ba89-9e79-4d97-aa2d-d3ad018fe5fa"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:11:25.203696 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.203586 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-service-ca" (OuterVolumeSpecName: "service-ca") pod "6789ba89-9e79-4d97-aa2d-d3ad018fe5fa" (UID: "6789ba89-9e79-4d97-aa2d-d3ad018fe5fa"). 
InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:11:25.203696 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.203553 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6789ba89-9e79-4d97-aa2d-d3ad018fe5fa" (UID: "6789ba89-9e79-4d97-aa2d-d3ad018fe5fa"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:11:25.203696 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.203563 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-console-config" (OuterVolumeSpecName: "console-config") pod "6789ba89-9e79-4d97-aa2d-d3ad018fe5fa" (UID: "6789ba89-9e79-4d97-aa2d-d3ad018fe5fa"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:11:25.205341 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.205310 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6789ba89-9e79-4d97-aa2d-d3ad018fe5fa" (UID: "6789ba89-9e79-4d97-aa2d-d3ad018fe5fa"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:11:25.205442 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.205391 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6789ba89-9e79-4d97-aa2d-d3ad018fe5fa" (UID: "6789ba89-9e79-4d97-aa2d-d3ad018fe5fa"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:11:25.205517 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.205434 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-kube-api-access-48b8v" (OuterVolumeSpecName: "kube-api-access-48b8v") pod "6789ba89-9e79-4d97-aa2d-d3ad018fe5fa" (UID: "6789ba89-9e79-4d97-aa2d-d3ad018fe5fa"). InnerVolumeSpecName "kube-api-access-48b8v". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:11:25.304621 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.304575 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-console-config\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:11:25.304621 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.304611 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-service-ca\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:11:25.304621 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.304623 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-48b8v\" (UniqueName: \"kubernetes.io/projected/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-kube-api-access-48b8v\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:11:25.304621 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.304633 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-oauth-serving-cert\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:11:25.304898 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.304644 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-console-oauth-config\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:11:25.304898 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.304653 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-console-serving-cert\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:11:25.304898 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.304662 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa-trusted-ca-bundle\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:11:25.889731 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.889701 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d9bf87884-scbxr_6789ba89-9e79-4d97-aa2d-d3ad018fe5fa/console/0.log" Apr 20 15:11:25.890189 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.889742 2577 generic.go:358] "Generic (PLEG): container finished" podID="6789ba89-9e79-4d97-aa2d-d3ad018fe5fa" containerID="8b67b9075a34cedadf35b925434c704c5a43e05367085fa2227ccbbe92478bf9" exitCode=2 Apr 20 15:11:25.890189 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.889779 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d9bf87884-scbxr" event={"ID":"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa","Type":"ContainerDied","Data":"8b67b9075a34cedadf35b925434c704c5a43e05367085fa2227ccbbe92478bf9"} Apr 20 15:11:25.890189 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.889824 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d9bf87884-scbxr" event={"ID":"6789ba89-9e79-4d97-aa2d-d3ad018fe5fa","Type":"ContainerDied","Data":"58f631cf951b2138ba39e3ec42d195761bc66eae68dc7b8d7e12954c845385ef"} Apr 20 15:11:25.890189 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.889838 2577 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d9bf87884-scbxr" Apr 20 15:11:25.890189 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.889840 2577 scope.go:117] "RemoveContainer" containerID="8b67b9075a34cedadf35b925434c704c5a43e05367085fa2227ccbbe92478bf9" Apr 20 15:11:25.891435 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.891410 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-wh4hx" event={"ID":"34a47557-4f99-476a-b736-08f8ffe4db64","Type":"ContainerStarted","Data":"0cc7d2ebd9d1b80a88ce3e88795581f46d5b078fc749fc176f3fcbd059815c14"} Apr 20 15:11:25.899379 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.899349 2577 scope.go:117] "RemoveContainer" containerID="8b67b9075a34cedadf35b925434c704c5a43e05367085fa2227ccbbe92478bf9" Apr 20 15:11:25.899635 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:11:25.899615 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b67b9075a34cedadf35b925434c704c5a43e05367085fa2227ccbbe92478bf9\": container with ID starting with 8b67b9075a34cedadf35b925434c704c5a43e05367085fa2227ccbbe92478bf9 not found: ID does not exist" containerID="8b67b9075a34cedadf35b925434c704c5a43e05367085fa2227ccbbe92478bf9" Apr 20 15:11:25.899696 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.899644 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b67b9075a34cedadf35b925434c704c5a43e05367085fa2227ccbbe92478bf9"} err="failed to get container status \"8b67b9075a34cedadf35b925434c704c5a43e05367085fa2227ccbbe92478bf9\": rpc error: code = NotFound desc = could not find container \"8b67b9075a34cedadf35b925434c704c5a43e05367085fa2227ccbbe92478bf9\": container with ID starting with 8b67b9075a34cedadf35b925434c704c5a43e05367085fa2227ccbbe92478bf9 not found: ID does not exist" Apr 20 15:11:25.908921 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.908876 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-wh4hx" podStartSLOduration=1.853165596 podStartE2EDuration="3.90886132s" podCreationTimestamp="2026-04-20 15:11:22 +0000 UTC" firstStartedPulling="2026-04-20 15:11:22.766933458 +0000 UTC m=+533.202477591" lastFinishedPulling="2026-04-20 15:11:24.822629183 +0000 UTC m=+535.258173315" observedRunningTime="2026-04-20 15:11:25.907843254 +0000 UTC m=+536.343387421" watchObservedRunningTime="2026-04-20 15:11:25.90886132 +0000 UTC m=+536.344405475" Apr 20 15:11:25.922171 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.922137 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d9bf87884-scbxr"] Apr 20 15:11:25.927634 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:25.927600 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5d9bf87884-scbxr"] Apr 20 15:11:26.196220 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:26.196137 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6789ba89-9e79-4d97-aa2d-d3ad018fe5fa" path="/var/lib/kubelet/pods/6789ba89-9e79-4d97-aa2d-d3ad018fe5fa/volumes" Apr 20 15:11:27.355904 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.355865 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf"] Apr 20 15:11:27.356324 ip-10-0-129-115 kubenswrapper[2577]: I0420 
15:11:27.356210 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6789ba89-9e79-4d97-aa2d-d3ad018fe5fa" containerName="console" Apr 20 15:11:27.356324 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.356227 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6789ba89-9e79-4d97-aa2d-d3ad018fe5fa" containerName="console" Apr 20 15:11:27.356324 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.356301 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="6789ba89-9e79-4d97-aa2d-d3ad018fe5fa" containerName="console" Apr 20 15:11:27.361276 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.361254 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.363684 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.363659 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 15:11:27.363824 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.363727 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-4j5l9\"" Apr 20 15:11:27.369432 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.369405 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf"] Apr 20 15:11:27.423731 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.423683 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/ef984f14-1549-4653-b515-f92a6fa0b180-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.423927 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.423738 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/ef984f14-1549-4653-b515-f92a6fa0b180-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.423927 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.423768 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ef984f14-1549-4653-b515-f92a6fa0b180-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.423927 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.423808 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcw8f\" (UniqueName: \"kubernetes.io/projected/ef984f14-1549-4653-b515-f92a6fa0b180-kube-api-access-jcw8f\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 
15:11:27.423927 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.423901 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/ef984f14-1549-4653-b515-f92a6fa0b180-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.424090 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.423939 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/ef984f14-1549-4653-b515-f92a6fa0b180-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.424090 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.423983 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ef984f14-1549-4653-b515-f92a6fa0b180-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.424090 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.424013 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/ef984f14-1549-4653-b515-f92a6fa0b180-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.424090 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.424031 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/ef984f14-1549-4653-b515-f92a6fa0b180-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.525089 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.525049 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ef984f14-1549-4653-b515-f92a6fa0b180-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.525271 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.525099 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/ef984f14-1549-4653-b515-f92a6fa0b180-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.525271 ip-10-0-129-115 kubenswrapper[2577]: I0420 
15:11:27.525210 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/ef984f14-1549-4653-b515-f92a6fa0b180-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.525271 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.525266 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/ef984f14-1549-4653-b515-f92a6fa0b180-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.525491 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.525325 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/ef984f14-1549-4653-b515-f92a6fa0b180-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.525491 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.525349 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ef984f14-1549-4653-b515-f92a6fa0b180-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.525491 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.525375 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcw8f\" (UniqueName: \"kubernetes.io/projected/ef984f14-1549-4653-b515-f92a6fa0b180-kube-api-access-jcw8f\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.525491 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.525434 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/ef984f14-1549-4653-b515-f92a6fa0b180-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.525491 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.525472 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/ef984f14-1549-4653-b515-f92a6fa0b180-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.525828 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.525804 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/ef984f14-1549-4653-b515-f92a6fa0b180-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.525903 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.525828 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/ef984f14-1549-4653-b515-f92a6fa0b180-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.525903 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.525883 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/ef984f14-1549-4653-b515-f92a6fa0b180-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.526035 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.526014 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/ef984f14-1549-4653-b515-f92a6fa0b180-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.526208 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.526187 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/ef984f14-1549-4653-b515-f92a6fa0b180-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.527432 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.527414 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/ef984f14-1549-4653-b515-f92a6fa0b180-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.527670 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.527653 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ef984f14-1549-4653-b515-f92a6fa0b180-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.533179 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.533147 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ef984f14-1549-4653-b515-f92a6fa0b180-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.533596 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.533578 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcw8f\" (UniqueName: \"kubernetes.io/projected/ef984f14-1549-4653-b515-f92a6fa0b180-kube-api-access-jcw8f\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf\" (UID: \"ef984f14-1549-4653-b515-f92a6fa0b180\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.674313 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.674196 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:27.801459 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.801432 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf"] Apr 20 15:11:27.804225 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:11:27.804189 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef984f14_1549_4653_b515_f92a6fa0b180.slice/crio-099f747ea383c9b2d120fabcac458f7a57ca48a8862e5eaa47287799b990257c WatchSource:0}: Error finding container 099f747ea383c9b2d120fabcac458f7a57ca48a8862e5eaa47287799b990257c: Status 404 returned error can't find the container with id 099f747ea383c9b2d120fabcac458f7a57ca48a8862e5eaa47287799b990257c Apr 20 15:11:27.900122 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:27.900087 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" event={"ID":"ef984f14-1549-4653-b515-f92a6fa0b180","Type":"ContainerStarted","Data":"099f747ea383c9b2d120fabcac458f7a57ca48a8862e5eaa47287799b990257c"} Apr 20 15:11:30.481415 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:30.481372 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 15:11:30.481710 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:30.481457 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 15:11:30.481710 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:30.481485 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 15:11:30.912217 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:30.912179 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" event={"ID":"ef984f14-1549-4653-b515-f92a6fa0b180","Type":"ContainerStarted","Data":"bcee62c58cd9cf7bb823373f8373706d6c48088c0344c0c6839727ef5980b6e5"} Apr 20 15:11:30.934705 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:30.934652 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" podStartSLOduration=1.259772958 podStartE2EDuration="3.934636225s" podCreationTimestamp="2026-04-20 15:11:27 +0000 UTC" firstStartedPulling="2026-04-20 
15:11:27.806216936 +0000 UTC m=+538.241761068" lastFinishedPulling="2026-04-20 15:11:30.481080203 +0000 UTC m=+540.916624335" observedRunningTime="2026-04-20 15:11:30.932939491 +0000 UTC m=+541.368483657" watchObservedRunningTime="2026-04-20 15:11:30.934636225 +0000 UTC m=+541.370180379" Apr 20 15:11:31.674888 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:31.674836 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:31.679817 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:31.679784 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:31.915941 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:31.915908 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:31.916837 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:31.916813 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf" Apr 20 15:11:32.636192 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:32.636156 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-wh4hx" Apr 20 15:11:32.636192 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:32.636199 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-wh4hx" Apr 20 15:11:32.659112 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:32.659083 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-wh4hx" Apr 20 15:11:32.941115 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:32.941036 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-wh4hx" Apr 20 15:11:53.958803 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:53.958766 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-nb47l"] Apr 20 15:11:53.962247 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:53.962230 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-nb47l" Apr 20 15:11:53.964643 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:53.964622 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-q7v27\"" Apr 20 15:11:53.964643 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:53.964630 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 20 15:11:53.969661 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:53.969638 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-nb47l"] Apr 20 15:11:54.057190 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:54.057148 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2lrd\" (UniqueName: \"kubernetes.io/projected/fd16d6fe-db4f-44b0-bc6c-b1bb4340172a-kube-api-access-d2lrd\") pod \"dns-operator-controller-manager-648d5c98bc-nb47l\" (UID: \"fd16d6fe-db4f-44b0-bc6c-b1bb4340172a\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-nb47l" Apr 20 15:11:54.158278 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:54.158240 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2lrd\" (UniqueName: \"kubernetes.io/projected/fd16d6fe-db4f-44b0-bc6c-b1bb4340172a-kube-api-access-d2lrd\") pod \"dns-operator-controller-manager-648d5c98bc-nb47l\" (UID: \"fd16d6fe-db4f-44b0-bc6c-b1bb4340172a\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-nb47l" Apr 20 15:11:54.172716 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:54.172690 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2lrd\" (UniqueName: \"kubernetes.io/projected/fd16d6fe-db4f-44b0-bc6c-b1bb4340172a-kube-api-access-d2lrd\") pod \"dns-operator-controller-manager-648d5c98bc-nb47l\" (UID: \"fd16d6fe-db4f-44b0-bc6c-b1bb4340172a\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-nb47l" Apr 20 15:11:54.272935 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:54.272901 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-nb47l" Apr 20 15:11:54.406382 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:54.406315 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-nb47l"] Apr 20 15:11:54.408730 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:11:54.408701 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd16d6fe_db4f_44b0_bc6c_b1bb4340172a.slice/crio-580761c1aa9b6d50f34df4d7295cbc925ce944fb05a80bd0db1e9eeac5186684 WatchSource:0}: Error finding container 580761c1aa9b6d50f34df4d7295cbc925ce944fb05a80bd0db1e9eeac5186684: Status 404 returned error can't find the container with id 580761c1aa9b6d50f34df4d7295cbc925ce944fb05a80bd0db1e9eeac5186684 Apr 20 15:11:54.992504 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:54.992470 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-nb47l" event={"ID":"fd16d6fe-db4f-44b0-bc6c-b1bb4340172a","Type":"ContainerStarted","Data":"580761c1aa9b6d50f34df4d7295cbc925ce944fb05a80bd0db1e9eeac5186684"} Apr 20 15:11:58.009511 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:58.009471 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-nb47l" event={"ID":"fd16d6fe-db4f-44b0-bc6c-b1bb4340172a","Type":"ContainerStarted","Data":"ee895ecacda59e36bb8634a1157ebb632d4d1cda1d5762d5405dd89bb4b33162"} Apr 20 15:11:58.009965 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:58.009589 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-nb47l" Apr 20 15:11:58.026382 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:11:58.026322 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-nb47l" podStartSLOduration=2.197207429 podStartE2EDuration="5.026277896s" podCreationTimestamp="2026-04-20 15:11:53 +0000 UTC" firstStartedPulling="2026-04-20 15:11:54.41061593 +0000 UTC m=+564.846160063" lastFinishedPulling="2026-04-20 15:11:57.239686386 +0000 UTC m=+567.675230530" observedRunningTime="2026-04-20 15:11:58.02497524 +0000 UTC m=+568.460519394" watchObservedRunningTime="2026-04-20 15:11:58.026277896 +0000 UTC m=+568.461822051" Apr 20 15:12:02.657249 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:02.657214 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-n62pm"] Apr 20 15:12:02.664266 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:02.664248 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-n62pm" Apr 20 15:12:02.666563 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:02.666544 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-k22w2\"" Apr 20 15:12:02.670966 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:02.670938 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-n62pm"] Apr 20 15:12:02.736990 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:02.736951 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tzvv\" (UniqueName: \"kubernetes.io/projected/3bacaaa3-bf99-43e3-ac8d-b4f397c4c9ee-kube-api-access-2tzvv\") pod \"authorino-operator-657f44b778-n62pm\" (UID: \"3bacaaa3-bf99-43e3-ac8d-b4f397c4c9ee\") " pod="kuadrant-system/authorino-operator-657f44b778-n62pm" Apr 20 15:12:02.838445 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:02.838404 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2tzvv\" (UniqueName: \"kubernetes.io/projected/3bacaaa3-bf99-43e3-ac8d-b4f397c4c9ee-kube-api-access-2tzvv\") pod \"authorino-operator-657f44b778-n62pm\" (UID: \"3bacaaa3-bf99-43e3-ac8d-b4f397c4c9ee\") " pod="kuadrant-system/authorino-operator-657f44b778-n62pm" Apr 20 15:12:02.849453 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:02.849419 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tzvv\" (UniqueName: \"kubernetes.io/projected/3bacaaa3-bf99-43e3-ac8d-b4f397c4c9ee-kube-api-access-2tzvv\") pod \"authorino-operator-657f44b778-n62pm\" (UID: \"3bacaaa3-bf99-43e3-ac8d-b4f397c4c9ee\") " pod="kuadrant-system/authorino-operator-657f44b778-n62pm" Apr 20 15:12:02.975457 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:02.975420 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-n62pm" Apr 20 15:12:03.106760 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:03.106453 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-n62pm"] Apr 20 15:12:03.118653 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:12:03.110915 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bacaaa3_bf99_43e3_ac8d_b4f397c4c9ee.slice/crio-25d64bcbe835894047d3f147cb8fe41f4ef41b45a73c2115b3c9a248dbf84569 WatchSource:0}: Error finding container 25d64bcbe835894047d3f147cb8fe41f4ef41b45a73c2115b3c9a248dbf84569: Status 404 returned error can't find the container with id 25d64bcbe835894047d3f147cb8fe41f4ef41b45a73c2115b3c9a248dbf84569 Apr 20 15:12:04.032190 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:04.032127 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-n62pm" event={"ID":"3bacaaa3-bf99-43e3-ac8d-b4f397c4c9ee","Type":"ContainerStarted","Data":"25d64bcbe835894047d3f147cb8fe41f4ef41b45a73c2115b3c9a248dbf84569"} Apr 20 15:12:06.041199 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:06.041165 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-n62pm" event={"ID":"3bacaaa3-bf99-43e3-ac8d-b4f397c4c9ee","Type":"ContainerStarted","Data":"dbc5dc67a59d32de4758dc779979e08a324cc56e85dfb714dbdad2b71ffea809"} Apr 20 15:12:06.041637 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:06.041266 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-n62pm" Apr 20 15:12:06.064408 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:06.064357 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-n62pm" podStartSLOduration=1.690141237 podStartE2EDuration="4.064341484s" podCreationTimestamp="2026-04-20 15:12:02 +0000 UTC" firstStartedPulling="2026-04-20 15:12:03.113321639 +0000 UTC m=+573.548865770" lastFinishedPulling="2026-04-20 15:12:05.487521885 +0000 UTC m=+575.923066017" observedRunningTime="2026-04-20 15:12:06.063041445 +0000 UTC m=+576.498585601" watchObservedRunningTime="2026-04-20 15:12:06.064341484 +0000 UTC m=+576.499885637" Apr 20 15:12:08.294897 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:08.294862 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wd94m"] Apr 20 15:12:08.297362 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:08.297341 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wd94m" Apr 20 15:12:08.300017 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:08.299989 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-qnq8w\"" Apr 20 15:12:08.300154 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:08.299988 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 20 15:12:08.300154 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:08.299995 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 20 15:12:08.305822 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:08.305800 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wd94m"] Apr 20 15:12:08.389725 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:08.389686 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/db3f3f71-699c-432f-a05e-dadebeccd795-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-wd94m\" (UID: \"db3f3f71-699c-432f-a05e-dadebeccd795\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wd94m" Apr 20 15:12:08.389901 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:08.389734 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/db3f3f71-699c-432f-a05e-dadebeccd795-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-wd94m\" (UID: \"db3f3f71-699c-432f-a05e-dadebeccd795\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wd94m" Apr 20 15:12:08.389901 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:08.389819 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsqvw\" (UniqueName: \"kubernetes.io/projected/db3f3f71-699c-432f-a05e-dadebeccd795-kube-api-access-fsqvw\") pod \"kuadrant-console-plugin-6cb54b5c86-wd94m\" (UID: \"db3f3f71-699c-432f-a05e-dadebeccd795\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wd94m" Apr 20 15:12:08.490422 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:08.490382 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsqvw\" (UniqueName: \"kubernetes.io/projected/db3f3f71-699c-432f-a05e-dadebeccd795-kube-api-access-fsqvw\") pod \"kuadrant-console-plugin-6cb54b5c86-wd94m\" (UID: \"db3f3f71-699c-432f-a05e-dadebeccd795\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wd94m" Apr 20 15:12:08.490587 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:08.490474 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/db3f3f71-699c-432f-a05e-dadebeccd795-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-wd94m\" (UID: \"db3f3f71-699c-432f-a05e-dadebeccd795\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wd94m" Apr 20 15:12:08.490587 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:08.490501 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/db3f3f71-699c-432f-a05e-dadebeccd795-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-wd94m\" (UID: 
\"db3f3f71-699c-432f-a05e-dadebeccd795\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wd94m" Apr 20 15:12:08.490658 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:12:08.490612 2577 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 20 15:12:08.490691 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:12:08.490683 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db3f3f71-699c-432f-a05e-dadebeccd795-plugin-serving-cert podName:db3f3f71-699c-432f-a05e-dadebeccd795 nodeName:}" failed. No retries permitted until 2026-04-20 15:12:08.990665173 +0000 UTC m=+579.426209309 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/db3f3f71-699c-432f-a05e-dadebeccd795-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-wd94m" (UID: "db3f3f71-699c-432f-a05e-dadebeccd795") : secret "plugin-serving-cert" not found Apr 20 15:12:08.491090 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:08.491073 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/db3f3f71-699c-432f-a05e-dadebeccd795-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-wd94m\" (UID: \"db3f3f71-699c-432f-a05e-dadebeccd795\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wd94m" Apr 20 15:12:08.498679 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:08.498661 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsqvw\" (UniqueName: \"kubernetes.io/projected/db3f3f71-699c-432f-a05e-dadebeccd795-kube-api-access-fsqvw\") pod \"kuadrant-console-plugin-6cb54b5c86-wd94m\" (UID: \"db3f3f71-699c-432f-a05e-dadebeccd795\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wd94m" Apr 20 15:12:08.995170 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:08.995127 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/db3f3f71-699c-432f-a05e-dadebeccd795-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-wd94m\" (UID: \"db3f3f71-699c-432f-a05e-dadebeccd795\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wd94m" Apr 20 15:12:08.995387 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:12:08.995276 2577 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 20 15:12:08.995387 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:12:08.995374 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db3f3f71-699c-432f-a05e-dadebeccd795-plugin-serving-cert podName:db3f3f71-699c-432f-a05e-dadebeccd795 nodeName:}" failed. No retries permitted until 2026-04-20 15:12:09.995355936 +0000 UTC m=+580.430900067 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/db3f3f71-699c-432f-a05e-dadebeccd795-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-wd94m" (UID: "db3f3f71-699c-432f-a05e-dadebeccd795") : secret "plugin-serving-cert" not found Apr 20 15:12:09.015712 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:09.015681 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-nb47l" Apr 20 15:12:10.005703 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:10.005656 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/db3f3f71-699c-432f-a05e-dadebeccd795-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-wd94m\" (UID: \"db3f3f71-699c-432f-a05e-dadebeccd795\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wd94m" Apr 20 15:12:10.008255 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:10.008227 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/db3f3f71-699c-432f-a05e-dadebeccd795-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-wd94m\" (UID: \"db3f3f71-699c-432f-a05e-dadebeccd795\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wd94m" Apr 20 15:12:10.107437 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:10.107378 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wd94m" Apr 20 15:12:10.237940 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:10.237905 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wd94m"] Apr 20 15:12:10.241534 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:12:10.241495 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb3f3f71_699c_432f_a05e_dadebeccd795.slice/crio-980e9080146182b4629fa290f5dc9c8b9351660e6d844712eabbaf84f60ba264 WatchSource:0}: Error finding container 980e9080146182b4629fa290f5dc9c8b9351660e6d844712eabbaf84f60ba264: Status 404 returned error can't find the container with id 980e9080146182b4629fa290f5dc9c8b9351660e6d844712eabbaf84f60ba264 Apr 20 15:12:11.059188 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:11.059150 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wd94m" event={"ID":"db3f3f71-699c-432f-a05e-dadebeccd795","Type":"ContainerStarted","Data":"980e9080146182b4629fa290f5dc9c8b9351660e6d844712eabbaf84f60ba264"} Apr 20 15:12:17.047549 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:17.047515 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-n62pm" Apr 20 15:12:18.884329 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:18.884118 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h"] Apr 20 15:12:18.887011 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:18.886983 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h" Apr 20 15:12:18.891660 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:18.891638 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-8b6tn\"" Apr 20 15:12:18.923688 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:18.923652 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h"] Apr 20 15:12:18.991275 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:18.991237 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ff967ac9-acbb-4159-904d-367dc3f8d6db-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-4x42h\" (UID: \"ff967ac9-acbb-4159-904d-367dc3f8d6db\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h" Apr 20 15:12:18.991539 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:18.991275 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmxs9\" (UniqueName: \"kubernetes.io/projected/ff967ac9-acbb-4159-904d-367dc3f8d6db-kube-api-access-lmxs9\") pod \"kuadrant-operator-controller-manager-55c7f4c975-4x42h\" (UID: \"ff967ac9-acbb-4159-904d-367dc3f8d6db\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h" Apr 20 15:12:19.092062 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:19.092020 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ff967ac9-acbb-4159-904d-367dc3f8d6db-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-4x42h\" (UID: \"ff967ac9-acbb-4159-904d-367dc3f8d6db\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h" Apr 20 15:12:19.092062 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:19.092061 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lmxs9\" (UniqueName: \"kubernetes.io/projected/ff967ac9-acbb-4159-904d-367dc3f8d6db-kube-api-access-lmxs9\") pod \"kuadrant-operator-controller-manager-55c7f4c975-4x42h\" (UID: \"ff967ac9-acbb-4159-904d-367dc3f8d6db\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h" Apr 20 15:12:19.092473 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:19.092454 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ff967ac9-acbb-4159-904d-367dc3f8d6db-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-4x42h\" (UID: \"ff967ac9-acbb-4159-904d-367dc3f8d6db\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h" Apr 20 15:12:19.102204 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:19.102176 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmxs9\" (UniqueName: \"kubernetes.io/projected/ff967ac9-acbb-4159-904d-367dc3f8d6db-kube-api-access-lmxs9\") pod \"kuadrant-operator-controller-manager-55c7f4c975-4x42h\" (UID: \"ff967ac9-acbb-4159-904d-367dc3f8d6db\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h" Apr 20 15:12:19.197877 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:19.197782 2577 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h" Apr 20 15:12:19.328405 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:19.328353 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h"] Apr 20 15:12:19.331602 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:12:19.331574 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff967ac9_acbb_4159_904d_367dc3f8d6db.slice/crio-68870cbfe1d45fbba7bcc911efb37f9d9dfde7d6aa81262a2af80772c1564317 WatchSource:0}: Error finding container 68870cbfe1d45fbba7bcc911efb37f9d9dfde7d6aa81262a2af80772c1564317: Status 404 returned error can't find the container with id 68870cbfe1d45fbba7bcc911efb37f9d9dfde7d6aa81262a2af80772c1564317 Apr 20 15:12:20.093349 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:20.093275 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h" event={"ID":"ff967ac9-acbb-4159-904d-367dc3f8d6db","Type":"ContainerStarted","Data":"68870cbfe1d45fbba7bcc911efb37f9d9dfde7d6aa81262a2af80772c1564317"} Apr 20 15:12:24.111927 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:24.111881 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h" event={"ID":"ff967ac9-acbb-4159-904d-367dc3f8d6db","Type":"ContainerStarted","Data":"67ce7af1a19802a51e734c0be312d5a2c8f1be51b2e2321c66966d4a1742b7b0"} Apr 20 15:12:24.112440 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:24.112055 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h" Apr 20 15:12:24.139951 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:24.139877 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h" podStartSLOduration=1.8978962579999998 podStartE2EDuration="6.139855817s" podCreationTimestamp="2026-04-20 15:12:18 +0000 UTC" firstStartedPulling="2026-04-20 15:12:19.333964517 +0000 UTC m=+589.769508650" lastFinishedPulling="2026-04-20 15:12:23.575924074 +0000 UTC m=+594.011468209" observedRunningTime="2026-04-20 15:12:24.139242042 +0000 UTC m=+594.574786198" watchObservedRunningTime="2026-04-20 15:12:24.139855817 +0000 UTC m=+594.575399972" Apr 20 15:12:35.119013 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:35.118975 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h" Apr 20 15:12:38.325202 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:38.325170 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-acl-logging/0.log" Apr 20 15:12:38.325734 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:38.325417 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-acl-logging/0.log" Apr 20 15:12:39.183280 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:39.183239 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wd94m" 
event={"ID":"db3f3f71-699c-432f-a05e-dadebeccd795","Type":"ContainerStarted","Data":"4416b15ca3663f49636bca4520c2bdd1293b7f75670459a56157c5c8b3534990"} Apr 20 15:12:39.200241 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:12:39.200179 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-wd94m" podStartSLOduration=3.074658073 podStartE2EDuration="31.200164696s" podCreationTimestamp="2026-04-20 15:12:08 +0000 UTC" firstStartedPulling="2026-04-20 15:12:10.243001391 +0000 UTC m=+580.678545527" lastFinishedPulling="2026-04-20 15:12:38.368508018 +0000 UTC m=+608.804052150" observedRunningTime="2026-04-20 15:12:39.197992029 +0000 UTC m=+609.633536175" watchObservedRunningTime="2026-04-20 15:12:39.200164696 +0000 UTC m=+609.635708849" Apr 20 15:13:24.283935 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:24.283896 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-88f8db554-bwhp7"] Apr 20 15:13:24.286368 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:24.286348 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-88f8db554-bwhp7" Apr 20 15:13:24.288894 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:24.288871 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-49rcd\"" Apr 20 15:13:24.294991 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:24.294730 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-88f8db554-bwhp7"] Apr 20 15:13:24.367498 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:24.367455 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps5kf\" (UniqueName: \"kubernetes.io/projected/9d683a92-0a77-4299-bf1c-14fe31aaa827-kube-api-access-ps5kf\") pod \"authorino-88f8db554-bwhp7\" (UID: \"9d683a92-0a77-4299-bf1c-14fe31aaa827\") " pod="kuadrant-system/authorino-88f8db554-bwhp7" Apr 20 15:13:24.468658 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:24.468614 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ps5kf\" (UniqueName: \"kubernetes.io/projected/9d683a92-0a77-4299-bf1c-14fe31aaa827-kube-api-access-ps5kf\") pod \"authorino-88f8db554-bwhp7\" (UID: \"9d683a92-0a77-4299-bf1c-14fe31aaa827\") " pod="kuadrant-system/authorino-88f8db554-bwhp7" Apr 20 15:13:24.478901 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:24.478862 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps5kf\" (UniqueName: \"kubernetes.io/projected/9d683a92-0a77-4299-bf1c-14fe31aaa827-kube-api-access-ps5kf\") pod \"authorino-88f8db554-bwhp7\" (UID: \"9d683a92-0a77-4299-bf1c-14fe31aaa827\") " pod="kuadrant-system/authorino-88f8db554-bwhp7" Apr 20 15:13:24.509625 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:24.509584 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-88f8db554-bwhp7"] Apr 20 15:13:24.509917 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:24.509899 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-88f8db554-bwhp7" Apr 20 15:13:24.659678 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:24.659651 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-88f8db554-bwhp7"] Apr 20 15:13:24.662376 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:13:24.662344 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d683a92_0a77_4299_bf1c_14fe31aaa827.slice/crio-7e8b059aac2a6f93d95c0aff528876baa4fdff710e87bab004072cd5d23c7cbf WatchSource:0}: Error finding container 7e8b059aac2a6f93d95c0aff528876baa4fdff710e87bab004072cd5d23c7cbf: Status 404 returned error can't find the container with id 7e8b059aac2a6f93d95c0aff528876baa4fdff710e87bab004072cd5d23c7cbf Apr 20 15:13:24.663974 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:24.663953 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 15:13:25.344584 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:25.344522 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-88f8db554-bwhp7" event={"ID":"9d683a92-0a77-4299-bf1c-14fe31aaa827","Type":"ContainerStarted","Data":"7e8b059aac2a6f93d95c0aff528876baa4fdff710e87bab004072cd5d23c7cbf"} Apr 20 15:13:27.354348 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:27.354310 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-88f8db554-bwhp7" event={"ID":"9d683a92-0a77-4299-bf1c-14fe31aaa827","Type":"ContainerStarted","Data":"1c68d243a476c83fdece446515fa67657516c4bc078159fc228e90d77b88566f"} Apr 20 15:13:27.354762 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:27.354392 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-88f8db554-bwhp7" podUID="9d683a92-0a77-4299-bf1c-14fe31aaa827" containerName="authorino" containerID="cri-o://1c68d243a476c83fdece446515fa67657516c4bc078159fc228e90d77b88566f" gracePeriod=30 Apr 20 15:13:27.377182 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:27.376672 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-88f8db554-bwhp7" podStartSLOduration=1.1003696 podStartE2EDuration="3.376651581s" podCreationTimestamp="2026-04-20 15:13:24 +0000 UTC" firstStartedPulling="2026-04-20 15:13:24.664111515 +0000 UTC m=+655.099655646" lastFinishedPulling="2026-04-20 15:13:26.940393492 +0000 UTC m=+657.375937627" observedRunningTime="2026-04-20 15:13:27.375105105 +0000 UTC m=+657.810649262" watchObservedRunningTime="2026-04-20 15:13:27.376651581 +0000 UTC m=+657.812195737" Apr 20 15:13:27.602296 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:27.602264 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-88f8db554-bwhp7" Apr 20 15:13:27.698169 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:27.698066 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps5kf\" (UniqueName: \"kubernetes.io/projected/9d683a92-0a77-4299-bf1c-14fe31aaa827-kube-api-access-ps5kf\") pod \"9d683a92-0a77-4299-bf1c-14fe31aaa827\" (UID: \"9d683a92-0a77-4299-bf1c-14fe31aaa827\") " Apr 20 15:13:27.700433 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:27.700406 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d683a92-0a77-4299-bf1c-14fe31aaa827-kube-api-access-ps5kf" (OuterVolumeSpecName: "kube-api-access-ps5kf") pod "9d683a92-0a77-4299-bf1c-14fe31aaa827" (UID: "9d683a92-0a77-4299-bf1c-14fe31aaa827"). InnerVolumeSpecName "kube-api-access-ps5kf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:13:27.798709 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:27.798646 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ps5kf\" (UniqueName: \"kubernetes.io/projected/9d683a92-0a77-4299-bf1c-14fe31aaa827-kube-api-access-ps5kf\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:13:28.359486 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:28.359445 2577 generic.go:358] "Generic (PLEG): container finished" podID="9d683a92-0a77-4299-bf1c-14fe31aaa827" containerID="1c68d243a476c83fdece446515fa67657516c4bc078159fc228e90d77b88566f" exitCode=0 Apr 20 15:13:28.359963 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:28.359499 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-88f8db554-bwhp7" Apr 20 15:13:28.359963 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:28.359544 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-88f8db554-bwhp7" event={"ID":"9d683a92-0a77-4299-bf1c-14fe31aaa827","Type":"ContainerDied","Data":"1c68d243a476c83fdece446515fa67657516c4bc078159fc228e90d77b88566f"} Apr 20 15:13:28.359963 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:28.359589 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-88f8db554-bwhp7" event={"ID":"9d683a92-0a77-4299-bf1c-14fe31aaa827","Type":"ContainerDied","Data":"7e8b059aac2a6f93d95c0aff528876baa4fdff710e87bab004072cd5d23c7cbf"} Apr 20 15:13:28.359963 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:28.359615 2577 scope.go:117] "RemoveContainer" containerID="1c68d243a476c83fdece446515fa67657516c4bc078159fc228e90d77b88566f" Apr 20 15:13:28.367885 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:28.367865 2577 scope.go:117] "RemoveContainer" containerID="1c68d243a476c83fdece446515fa67657516c4bc078159fc228e90d77b88566f" Apr 20 15:13:28.368168 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:13:28.368149 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c68d243a476c83fdece446515fa67657516c4bc078159fc228e90d77b88566f\": container with ID starting with 1c68d243a476c83fdece446515fa67657516c4bc078159fc228e90d77b88566f not found: ID does not exist" containerID="1c68d243a476c83fdece446515fa67657516c4bc078159fc228e90d77b88566f" Apr 20 15:13:28.368212 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:28.368180 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c68d243a476c83fdece446515fa67657516c4bc078159fc228e90d77b88566f"} 
err="failed to get container status \"1c68d243a476c83fdece446515fa67657516c4bc078159fc228e90d77b88566f\": rpc error: code = NotFound desc = could not find container \"1c68d243a476c83fdece446515fa67657516c4bc078159fc228e90d77b88566f\": container with ID starting with 1c68d243a476c83fdece446515fa67657516c4bc078159fc228e90d77b88566f not found: ID does not exist" Apr 20 15:13:28.374410 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:28.374384 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-88f8db554-bwhp7"] Apr 20 15:13:28.376210 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:28.376185 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-88f8db554-bwhp7"] Apr 20 15:13:30.196545 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:13:30.196511 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d683a92-0a77-4299-bf1c-14fe31aaa827" path="/var/lib/kubelet/pods/9d683a92-0a77-4299-bf1c-14fe31aaa827/volumes" Apr 20 15:14:11.890325 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:11.890228 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl"] Apr 20 15:14:11.890838 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:11.890725 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d683a92-0a77-4299-bf1c-14fe31aaa827" containerName="authorino" Apr 20 15:14:11.890838 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:11.890740 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d683a92-0a77-4299-bf1c-14fe31aaa827" containerName="authorino" Apr 20 15:14:11.890838 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:11.890834 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d683a92-0a77-4299-bf1c-14fe31aaa827" containerName="authorino" Apr 20 15:14:11.893259 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:11.893243 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:11.896921 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:11.896901 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 20 15:14:11.897091 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:11.896905 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-68bzf\"" Apr 20 15:14:11.897091 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:11.896952 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 15:14:11.897091 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:11.896905 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 15:14:11.907404 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:11.907373 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl"] Apr 20 15:14:11.982340 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:11.982300 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86d3a929-df38-4ee6-993c-de18d9db2ae8-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl\" (UID: \"86d3a929-df38-4ee6-993c-de18d9db2ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:11.982554 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:11.982359 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l79w\" (UniqueName: \"kubernetes.io/projected/86d3a929-df38-4ee6-993c-de18d9db2ae8-kube-api-access-8l79w\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl\" (UID: \"86d3a929-df38-4ee6-993c-de18d9db2ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:11.982554 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:11.982406 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86d3a929-df38-4ee6-993c-de18d9db2ae8-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl\" (UID: \"86d3a929-df38-4ee6-993c-de18d9db2ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:11.982554 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:11.982437 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/86d3a929-df38-4ee6-993c-de18d9db2ae8-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl\" (UID: \"86d3a929-df38-4ee6-993c-de18d9db2ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:11.982688 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:11.982558 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86d3a929-df38-4ee6-993c-de18d9db2ae8-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl\" (UID: \"86d3a929-df38-4ee6-993c-de18d9db2ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:11.982688 
ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:11.982611 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/86d3a929-df38-4ee6-993c-de18d9db2ae8-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl\" (UID: \"86d3a929-df38-4ee6-993c-de18d9db2ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:12.083479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:12.083419 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86d3a929-df38-4ee6-993c-de18d9db2ae8-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl\" (UID: \"86d3a929-df38-4ee6-993c-de18d9db2ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:12.083479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:12.083479 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/86d3a929-df38-4ee6-993c-de18d9db2ae8-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl\" (UID: \"86d3a929-df38-4ee6-993c-de18d9db2ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:12.083740 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:12.083563 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86d3a929-df38-4ee6-993c-de18d9db2ae8-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl\" (UID: \"86d3a929-df38-4ee6-993c-de18d9db2ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:12.083740 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:12.083603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/86d3a929-df38-4ee6-993c-de18d9db2ae8-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl\" (UID: \"86d3a929-df38-4ee6-993c-de18d9db2ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:12.083740 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:12.083624 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86d3a929-df38-4ee6-993c-de18d9db2ae8-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl\" (UID: \"86d3a929-df38-4ee6-993c-de18d9db2ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:12.083740 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:12.083652 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l79w\" (UniqueName: \"kubernetes.io/projected/86d3a929-df38-4ee6-993c-de18d9db2ae8-kube-api-access-8l79w\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl\" (UID: \"86d3a929-df38-4ee6-993c-de18d9db2ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:12.084059 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:12.084017 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/86d3a929-df38-4ee6-993c-de18d9db2ae8-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl\" (UID: 
\"86d3a929-df38-4ee6-993c-de18d9db2ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:12.084059 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:12.084044 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/86d3a929-df38-4ee6-993c-de18d9db2ae8-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl\" (UID: \"86d3a929-df38-4ee6-993c-de18d9db2ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:12.084219 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:12.084074 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86d3a929-df38-4ee6-993c-de18d9db2ae8-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl\" (UID: \"86d3a929-df38-4ee6-993c-de18d9db2ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:12.085847 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:12.085828 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86d3a929-df38-4ee6-993c-de18d9db2ae8-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl\" (UID: \"86d3a929-df38-4ee6-993c-de18d9db2ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:12.086036 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:12.086020 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86d3a929-df38-4ee6-993c-de18d9db2ae8-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl\" (UID: \"86d3a929-df38-4ee6-993c-de18d9db2ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:12.092161 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:12.092138 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l79w\" (UniqueName: \"kubernetes.io/projected/86d3a929-df38-4ee6-993c-de18d9db2ae8-kube-api-access-8l79w\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl\" (UID: \"86d3a929-df38-4ee6-993c-de18d9db2ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:12.203437 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:12.203349 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:12.336772 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:12.336741 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl"] Apr 20 15:14:12.339549 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:14:12.339501 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86d3a929_df38_4ee6_993c_de18d9db2ae8.slice/crio-e9b4b2d7641a9d75c3f878f584798674bb4322d99aa4282e242e46e8876ae6e5 WatchSource:0}: Error finding container e9b4b2d7641a9d75c3f878f584798674bb4322d99aa4282e242e46e8876ae6e5: Status 404 returned error can't find the container with id e9b4b2d7641a9d75c3f878f584798674bb4322d99aa4282e242e46e8876ae6e5 Apr 20 15:14:12.514337 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:12.514300 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" event={"ID":"86d3a929-df38-4ee6-993c-de18d9db2ae8","Type":"ContainerStarted","Data":"e9b4b2d7641a9d75c3f878f584798674bb4322d99aa4282e242e46e8876ae6e5"} Apr 20 15:14:18.541176 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:18.541130 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" event={"ID":"86d3a929-df38-4ee6-993c-de18d9db2ae8","Type":"ContainerStarted","Data":"ec7e0a3e67c3cd0941f5ac1e7749e5d2747e177fc5bb3a2e45a9955bab55ec07"} Apr 20 15:14:26.572524 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:26.572428 2577 generic.go:358] "Generic (PLEG): container finished" podID="86d3a929-df38-4ee6-993c-de18d9db2ae8" containerID="ec7e0a3e67c3cd0941f5ac1e7749e5d2747e177fc5bb3a2e45a9955bab55ec07" exitCode=0 Apr 20 15:14:26.572524 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:26.572496 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" event={"ID":"86d3a929-df38-4ee6-993c-de18d9db2ae8","Type":"ContainerDied","Data":"ec7e0a3e67c3cd0941f5ac1e7749e5d2747e177fc5bb3a2e45a9955bab55ec07"} Apr 20 15:14:28.581969 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:28.581931 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" event={"ID":"86d3a929-df38-4ee6-993c-de18d9db2ae8","Type":"ContainerStarted","Data":"39158f0a352badc066a563cfc415866b33ea1ee89026b48125ffbbeacb6210d6"} Apr 20 15:14:28.582475 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:28.582147 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:28.602576 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:28.602526 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" podStartSLOduration=2.241377316 podStartE2EDuration="17.602509958s" podCreationTimestamp="2026-04-20 15:14:11 +0000 UTC" firstStartedPulling="2026-04-20 15:14:12.341508956 +0000 UTC m=+702.777053088" lastFinishedPulling="2026-04-20 15:14:27.702641595 +0000 UTC m=+718.138185730" observedRunningTime="2026-04-20 15:14:28.600653754 +0000 UTC m=+719.036197910" watchObservedRunningTime="2026-04-20 15:14:28.602509958 +0000 UTC m=+719.038054111" Apr 20 15:14:39.599476 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:39.599434 2577 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl" Apr 20 15:14:59.794864 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:59.794829 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw"] Apr 20 15:14:59.798718 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:59.798689 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:14:59.801337 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:59.801314 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 20 15:14:59.807746 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:59.807719 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw"] Apr 20 15:14:59.939623 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:59.939578 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/980008de-3ac8-4794-8355-81ecf568e2d3-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw\" (UID: \"980008de-3ac8-4794-8355-81ecf568e2d3\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:14:59.939623 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:59.939623 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/980008de-3ac8-4794-8355-81ecf568e2d3-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw\" (UID: \"980008de-3ac8-4794-8355-81ecf568e2d3\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:14:59.939844 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:59.939723 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/980008de-3ac8-4794-8355-81ecf568e2d3-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw\" (UID: \"980008de-3ac8-4794-8355-81ecf568e2d3\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:14:59.939844 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:59.939779 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/980008de-3ac8-4794-8355-81ecf568e2d3-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw\" (UID: \"980008de-3ac8-4794-8355-81ecf568e2d3\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:14:59.939844 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:59.939812 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/980008de-3ac8-4794-8355-81ecf568e2d3-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw\" (UID: \"980008de-3ac8-4794-8355-81ecf568e2d3\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:14:59.939844 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:14:59.939829 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6cmp\" (UniqueName: \"kubernetes.io/projected/980008de-3ac8-4794-8355-81ecf568e2d3-kube-api-access-x6cmp\") pod 
\"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw\" (UID: \"980008de-3ac8-4794-8355-81ecf568e2d3\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:15:00.040656 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.040613 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/980008de-3ac8-4794-8355-81ecf568e2d3-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw\" (UID: \"980008de-3ac8-4794-8355-81ecf568e2d3\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:15:00.040656 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.040659 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6cmp\" (UniqueName: \"kubernetes.io/projected/980008de-3ac8-4794-8355-81ecf568e2d3-kube-api-access-x6cmp\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw\" (UID: \"980008de-3ac8-4794-8355-81ecf568e2d3\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:15:00.040931 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.040699 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/980008de-3ac8-4794-8355-81ecf568e2d3-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw\" (UID: \"980008de-3ac8-4794-8355-81ecf568e2d3\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:15:00.040931 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.040718 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/980008de-3ac8-4794-8355-81ecf568e2d3-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw\" (UID: \"980008de-3ac8-4794-8355-81ecf568e2d3\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:15:00.040931 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.040762 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/980008de-3ac8-4794-8355-81ecf568e2d3-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw\" (UID: \"980008de-3ac8-4794-8355-81ecf568e2d3\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:15:00.040931 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.040799 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/980008de-3ac8-4794-8355-81ecf568e2d3-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw\" (UID: \"980008de-3ac8-4794-8355-81ecf568e2d3\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:15:00.041138 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.041105 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/980008de-3ac8-4794-8355-81ecf568e2d3-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw\" (UID: \"980008de-3ac8-4794-8355-81ecf568e2d3\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:15:00.041242 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.041221 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/980008de-3ac8-4794-8355-81ecf568e2d3-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw\" (UID: 
\"980008de-3ac8-4794-8355-81ecf568e2d3\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:15:00.041317 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.041259 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/980008de-3ac8-4794-8355-81ecf568e2d3-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw\" (UID: \"980008de-3ac8-4794-8355-81ecf568e2d3\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:15:00.043138 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.043115 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/980008de-3ac8-4794-8355-81ecf568e2d3-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw\" (UID: \"980008de-3ac8-4794-8355-81ecf568e2d3\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:15:00.043415 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.043398 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/980008de-3ac8-4794-8355-81ecf568e2d3-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw\" (UID: \"980008de-3ac8-4794-8355-81ecf568e2d3\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:15:00.048800 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.048744 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6cmp\" (UniqueName: \"kubernetes.io/projected/980008de-3ac8-4794-8355-81ecf568e2d3-kube-api-access-x6cmp\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw\" (UID: \"980008de-3ac8-4794-8355-81ecf568e2d3\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:15:00.109852 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.109814 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:15:00.142233 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.142194 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29611635-7h2ql"] Apr 20 15:15:00.148038 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.148016 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611635-7h2ql" Apr 20 15:15:00.150639 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.150609 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-5m5vb\"" Apr 20 15:15:00.151802 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.151754 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611635-7h2ql"] Apr 20 15:15:00.242690 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.242654 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp592\" (UniqueName: \"kubernetes.io/projected/f28beb24-a23c-4127-9919-1ac38c95493f-kube-api-access-rp592\") pod \"maas-api-key-cleanup-29611635-7h2ql\" (UID: \"f28beb24-a23c-4127-9919-1ac38c95493f\") " pod="opendatahub/maas-api-key-cleanup-29611635-7h2ql" Apr 20 15:15:00.252871 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.252837 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw"] Apr 20 15:15:00.255420 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:15:00.255391 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod980008de_3ac8_4794_8355_81ecf568e2d3.slice/crio-6ea7275a9160dec5813836be4f933c0961d2dd832b6ad4e4665d0edc7e035cad WatchSource:0}: Error finding container 6ea7275a9160dec5813836be4f933c0961d2dd832b6ad4e4665d0edc7e035cad: Status 404 returned error can't find the container with id 6ea7275a9160dec5813836be4f933c0961d2dd832b6ad4e4665d0edc7e035cad Apr 20 15:15:00.344066 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.344025 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rp592\" (UniqueName: \"kubernetes.io/projected/f28beb24-a23c-4127-9919-1ac38c95493f-kube-api-access-rp592\") pod \"maas-api-key-cleanup-29611635-7h2ql\" (UID: \"f28beb24-a23c-4127-9919-1ac38c95493f\") " pod="opendatahub/maas-api-key-cleanup-29611635-7h2ql" Apr 20 15:15:00.352140 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.352108 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp592\" (UniqueName: \"kubernetes.io/projected/f28beb24-a23c-4127-9919-1ac38c95493f-kube-api-access-rp592\") pod \"maas-api-key-cleanup-29611635-7h2ql\" (UID: \"f28beb24-a23c-4127-9919-1ac38c95493f\") " pod="opendatahub/maas-api-key-cleanup-29611635-7h2ql" Apr 20 15:15:00.462718 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.462678 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611635-7h2ql" Apr 20 15:15:00.599306 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.599252 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611635-7h2ql"] Apr 20 15:15:00.600669 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:15:00.600637 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf28beb24_a23c_4127_9919_1ac38c95493f.slice/crio-520858c44456afd3ea09090332874b0c4c7124ac45bf5d58d17ab4c74e64fbc9 WatchSource:0}: Error finding container 520858c44456afd3ea09090332874b0c4c7124ac45bf5d58d17ab4c74e64fbc9: Status 404 returned error can't find the container with id 520858c44456afd3ea09090332874b0c4c7124ac45bf5d58d17ab4c74e64fbc9 Apr 20 15:15:00.698775 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.698728 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" event={"ID":"980008de-3ac8-4794-8355-81ecf568e2d3","Type":"ContainerStarted","Data":"ea04e729e3b1405593a213c37d83396f3b989162f5dad741799a5c4a3d4788e5"} Apr 20 15:15:00.698775 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.698780 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" event={"ID":"980008de-3ac8-4794-8355-81ecf568e2d3","Type":"ContainerStarted","Data":"6ea7275a9160dec5813836be4f933c0961d2dd832b6ad4e4665d0edc7e035cad"} Apr 20 15:15:00.699986 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:00.699955 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-7h2ql" event={"ID":"f28beb24-a23c-4127-9919-1ac38c95493f","Type":"ContainerStarted","Data":"520858c44456afd3ea09090332874b0c4c7124ac45bf5d58d17ab4c74e64fbc9"} Apr 20 15:15:03.712259 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:03.712220 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-7h2ql" event={"ID":"f28beb24-a23c-4127-9919-1ac38c95493f","Type":"ContainerStarted","Data":"08a36d33b74773850f46ec1796a9105100209cd6eb8b37d8fa375aed88276b34"} Apr 20 15:15:03.727969 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:03.727914 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29611635-7h2ql" podStartSLOduration=1.880184418 podStartE2EDuration="3.72789641s" podCreationTimestamp="2026-04-20 15:15:00 +0000 UTC" firstStartedPulling="2026-04-20 15:15:00.602637981 +0000 UTC m=+751.038182113" lastFinishedPulling="2026-04-20 15:15:02.450349969 +0000 UTC m=+752.885894105" observedRunningTime="2026-04-20 15:15:03.727331553 +0000 UTC m=+754.162875711" watchObservedRunningTime="2026-04-20 15:15:03.72789641 +0000 UTC m=+754.163440563" Apr 20 15:15:06.729332 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:06.729266 2577 generic.go:358] "Generic (PLEG): container finished" podID="980008de-3ac8-4794-8355-81ecf568e2d3" containerID="ea04e729e3b1405593a213c37d83396f3b989162f5dad741799a5c4a3d4788e5" exitCode=0 Apr 20 15:15:06.729869 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:06.729344 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" event={"ID":"980008de-3ac8-4794-8355-81ecf568e2d3","Type":"ContainerDied","Data":"ea04e729e3b1405593a213c37d83396f3b989162f5dad741799a5c4a3d4788e5"} Apr 20 15:15:07.734592 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:07.734551 
2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" event={"ID":"980008de-3ac8-4794-8355-81ecf568e2d3","Type":"ContainerStarted","Data":"63ae5ee09379ce9405bac25114144cbd6d1693f93334570a978371d1905fd45d"} Apr 20 15:15:07.734984 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:07.734780 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:15:07.755762 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:07.755707 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" podStartSLOduration=8.49204663 podStartE2EDuration="8.75568559s" podCreationTimestamp="2026-04-20 15:14:59 +0000 UTC" firstStartedPulling="2026-04-20 15:15:06.730155896 +0000 UTC m=+757.165700028" lastFinishedPulling="2026-04-20 15:15:06.993794856 +0000 UTC m=+757.429338988" observedRunningTime="2026-04-20 15:15:07.753664204 +0000 UTC m=+758.189208359" watchObservedRunningTime="2026-04-20 15:15:07.75568559 +0000 UTC m=+758.191229744" Apr 20 15:15:18.751811 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:18.751773 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw" Apr 20 15:15:23.794053 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:23.794015 2577 generic.go:358] "Generic (PLEG): container finished" podID="f28beb24-a23c-4127-9919-1ac38c95493f" containerID="08a36d33b74773850f46ec1796a9105100209cd6eb8b37d8fa375aed88276b34" exitCode=6 Apr 20 15:15:23.794486 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:23.794089 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-7h2ql" event={"ID":"f28beb24-a23c-4127-9919-1ac38c95493f","Type":"ContainerDied","Data":"08a36d33b74773850f46ec1796a9105100209cd6eb8b37d8fa375aed88276b34"} Apr 20 15:15:23.794486 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:23.794463 2577 scope.go:117] "RemoveContainer" containerID="08a36d33b74773850f46ec1796a9105100209cd6eb8b37d8fa375aed88276b34" Apr 20 15:15:24.799721 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:24.799686 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-7h2ql" event={"ID":"f28beb24-a23c-4127-9919-1ac38c95493f","Type":"ContainerStarted","Data":"3f9a3d8f7a20a2cda4ffef32a2781007e8a781967f08386c8b711bb852416a0a"} Apr 20 15:15:37.697088 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.697049 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8"] Apr 20 15:15:37.702239 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.702213 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:37.704622 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.704595 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 20 15:15:37.712847 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.712816 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8"] Apr 20 15:15:37.805348 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.805270 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ea02a932-86e0-416f-9299-9fc5ffeae749-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-72sb8\" (UID: \"ea02a932-86e0-416f-9299-9fc5ffeae749\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:37.805544 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.805385 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ea02a932-86e0-416f-9299-9fc5ffeae749-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-72sb8\" (UID: \"ea02a932-86e0-416f-9299-9fc5ffeae749\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:37.805544 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.805462 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ea02a932-86e0-416f-9299-9fc5ffeae749-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-72sb8\" (UID: \"ea02a932-86e0-416f-9299-9fc5ffeae749\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:37.805544 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.805495 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t85b9\" (UniqueName: \"kubernetes.io/projected/ea02a932-86e0-416f-9299-9fc5ffeae749-kube-api-access-t85b9\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-72sb8\" (UID: \"ea02a932-86e0-416f-9299-9fc5ffeae749\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:37.805544 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.805533 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea02a932-86e0-416f-9299-9fc5ffeae749-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-72sb8\" (UID: \"ea02a932-86e0-416f-9299-9fc5ffeae749\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:37.805695 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.805553 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ea02a932-86e0-416f-9299-9fc5ffeae749-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-72sb8\" (UID: \"ea02a932-86e0-416f-9299-9fc5ffeae749\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:37.906395 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.906357 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ea02a932-86e0-416f-9299-9fc5ffeae749-model-cache\") pod 
\"e2e-trlp-test-simulated-kserve-84db68679b-72sb8\" (UID: \"ea02a932-86e0-416f-9299-9fc5ffeae749\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:37.906395 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.906397 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ea02a932-86e0-416f-9299-9fc5ffeae749-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-72sb8\" (UID: \"ea02a932-86e0-416f-9299-9fc5ffeae749\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:37.906662 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.906448 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ea02a932-86e0-416f-9299-9fc5ffeae749-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-72sb8\" (UID: \"ea02a932-86e0-416f-9299-9fc5ffeae749\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:37.906662 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.906474 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t85b9\" (UniqueName: \"kubernetes.io/projected/ea02a932-86e0-416f-9299-9fc5ffeae749-kube-api-access-t85b9\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-72sb8\" (UID: \"ea02a932-86e0-416f-9299-9fc5ffeae749\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:37.906662 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.906503 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea02a932-86e0-416f-9299-9fc5ffeae749-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-72sb8\" (UID: \"ea02a932-86e0-416f-9299-9fc5ffeae749\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:37.906662 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.906523 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ea02a932-86e0-416f-9299-9fc5ffeae749-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-72sb8\" (UID: \"ea02a932-86e0-416f-9299-9fc5ffeae749\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:37.906881 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.906854 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ea02a932-86e0-416f-9299-9fc5ffeae749-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-72sb8\" (UID: \"ea02a932-86e0-416f-9299-9fc5ffeae749\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:37.906958 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.906938 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea02a932-86e0-416f-9299-9fc5ffeae749-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-72sb8\" (UID: \"ea02a932-86e0-416f-9299-9fc5ffeae749\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:37.907016 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.906998 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ea02a932-86e0-416f-9299-9fc5ffeae749-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-72sb8\" (UID: 
\"ea02a932-86e0-416f-9299-9fc5ffeae749\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:37.908827 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.908806 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ea02a932-86e0-416f-9299-9fc5ffeae749-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-72sb8\" (UID: \"ea02a932-86e0-416f-9299-9fc5ffeae749\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:37.909081 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.909064 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ea02a932-86e0-416f-9299-9fc5ffeae749-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-72sb8\" (UID: \"ea02a932-86e0-416f-9299-9fc5ffeae749\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:37.914843 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:37.914818 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t85b9\" (UniqueName: \"kubernetes.io/projected/ea02a932-86e0-416f-9299-9fc5ffeae749-kube-api-access-t85b9\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-72sb8\" (UID: \"ea02a932-86e0-416f-9299-9fc5ffeae749\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:38.014322 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:38.014264 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:38.148539 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:38.148503 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8"] Apr 20 15:15:38.159181 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:15:38.153544 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea02a932_86e0_416f_9299_9fc5ffeae749.slice/crio-4fe34a0d321067c6d8a46899646fce299b48f80075da5d03f8a338bcc5f96070 WatchSource:0}: Error finding container 4fe34a0d321067c6d8a46899646fce299b48f80075da5d03f8a338bcc5f96070: Status 404 returned error can't find the container with id 4fe34a0d321067c6d8a46899646fce299b48f80075da5d03f8a338bcc5f96070 Apr 20 15:15:38.860201 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:38.860155 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" event={"ID":"ea02a932-86e0-416f-9299-9fc5ffeae749","Type":"ContainerStarted","Data":"7ec050056bb0912f355a72a8331624c57be990c3bdd20b5062f55048d7154e75"} Apr 20 15:15:38.860201 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:38.860206 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" event={"ID":"ea02a932-86e0-416f-9299-9fc5ffeae749","Type":"ContainerStarted","Data":"4fe34a0d321067c6d8a46899646fce299b48f80075da5d03f8a338bcc5f96070"} Apr 20 15:15:43.881024 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:43.880989 2577 generic.go:358] "Generic (PLEG): container finished" podID="ea02a932-86e0-416f-9299-9fc5ffeae749" containerID="7ec050056bb0912f355a72a8331624c57be990c3bdd20b5062f55048d7154e75" exitCode=0 Apr 20 15:15:43.881414 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:43.881061 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" 
event={"ID":"ea02a932-86e0-416f-9299-9fc5ffeae749","Type":"ContainerDied","Data":"7ec050056bb0912f355a72a8331624c57be990c3bdd20b5062f55048d7154e75"} Apr 20 15:15:44.886448 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:44.886351 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" event={"ID":"ea02a932-86e0-416f-9299-9fc5ffeae749","Type":"ContainerStarted","Data":"c488cfd75f7b86591f019da97a7da1921cbf6fa0264438542e7fbe2f1f900883"} Apr 20 15:15:44.886862 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:44.886587 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:44.888062 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:44.888037 2577 generic.go:358] "Generic (PLEG): container finished" podID="f28beb24-a23c-4127-9919-1ac38c95493f" containerID="3f9a3d8f7a20a2cda4ffef32a2781007e8a781967f08386c8b711bb852416a0a" exitCode=6 Apr 20 15:15:44.888179 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:44.888072 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-7h2ql" event={"ID":"f28beb24-a23c-4127-9919-1ac38c95493f","Type":"ContainerDied","Data":"3f9a3d8f7a20a2cda4ffef32a2781007e8a781967f08386c8b711bb852416a0a"} Apr 20 15:15:44.888179 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:44.888096 2577 scope.go:117] "RemoveContainer" containerID="08a36d33b74773850f46ec1796a9105100209cd6eb8b37d8fa375aed88276b34" Apr 20 15:15:44.888415 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:44.888396 2577 scope.go:117] "RemoveContainer" containerID="3f9a3d8f7a20a2cda4ffef32a2781007e8a781967f08386c8b711bb852416a0a" Apr 20 15:15:44.888673 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:15:44.888650 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29611635-7h2ql_opendatahub(f28beb24-a23c-4127-9919-1ac38c95493f)\"" pod="opendatahub/maas-api-key-cleanup-29611635-7h2ql" podUID="f28beb24-a23c-4127-9919-1ac38c95493f" Apr 20 15:15:44.907078 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:44.907031 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" podStartSLOduration=7.617802176 podStartE2EDuration="7.907015897s" podCreationTimestamp="2026-04-20 15:15:37 +0000 UTC" firstStartedPulling="2026-04-20 15:15:43.881743441 +0000 UTC m=+794.317287573" lastFinishedPulling="2026-04-20 15:15:44.170957162 +0000 UTC m=+794.606501294" observedRunningTime="2026-04-20 15:15:44.904216486 +0000 UTC m=+795.339760640" watchObservedRunningTime="2026-04-20 15:15:44.907015897 +0000 UTC m=+795.342560051" Apr 20 15:15:55.907123 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:55.907089 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-72sb8" Apr 20 15:15:56.192041 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:56.191938 2577 scope.go:117] "RemoveContainer" containerID="3f9a3d8f7a20a2cda4ffef32a2781007e8a781967f08386c8b711bb852416a0a" Apr 20 15:15:56.935155 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:56.935069 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-7h2ql" 
event={"ID":"f28beb24-a23c-4127-9919-1ac38c95493f","Type":"ContainerStarted","Data":"38035a73f5855daed1be8cea98f6f803ffd7f87f9ecb2db5ea4acbe76994e6ef"} Apr 20 15:15:57.219034 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:57.218996 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611635-7h2ql"] Apr 20 15:15:57.939303 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:15:57.939244 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29611635-7h2ql" podUID="f28beb24-a23c-4127-9919-1ac38c95493f" containerName="cleanup" containerID="cri-o://38035a73f5855daed1be8cea98f6f803ffd7f87f9ecb2db5ea4acbe76994e6ef" gracePeriod=30 Apr 20 15:16:16.979359 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:16:16.979333 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611635-7h2ql" Apr 20 15:16:17.010190 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:16:17.010151 2577 generic.go:358] "Generic (PLEG): container finished" podID="f28beb24-a23c-4127-9919-1ac38c95493f" containerID="38035a73f5855daed1be8cea98f6f803ffd7f87f9ecb2db5ea4acbe76994e6ef" exitCode=6 Apr 20 15:16:17.010404 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:16:17.010234 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611635-7h2ql" Apr 20 15:16:17.010404 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:16:17.010246 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-7h2ql" event={"ID":"f28beb24-a23c-4127-9919-1ac38c95493f","Type":"ContainerDied","Data":"38035a73f5855daed1be8cea98f6f803ffd7f87f9ecb2db5ea4acbe76994e6ef"} Apr 20 15:16:17.010404 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:16:17.010275 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-7h2ql" event={"ID":"f28beb24-a23c-4127-9919-1ac38c95493f","Type":"ContainerDied","Data":"520858c44456afd3ea09090332874b0c4c7124ac45bf5d58d17ab4c74e64fbc9"} Apr 20 15:16:17.010404 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:16:17.010319 2577 scope.go:117] "RemoveContainer" containerID="38035a73f5855daed1be8cea98f6f803ffd7f87f9ecb2db5ea4acbe76994e6ef" Apr 20 15:16:17.022154 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:16:17.022133 2577 scope.go:117] "RemoveContainer" containerID="3f9a3d8f7a20a2cda4ffef32a2781007e8a781967f08386c8b711bb852416a0a" Apr 20 15:16:17.032856 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:16:17.032813 2577 scope.go:117] "RemoveContainer" containerID="38035a73f5855daed1be8cea98f6f803ffd7f87f9ecb2db5ea4acbe76994e6ef" Apr 20 15:16:17.033245 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:16:17.033224 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38035a73f5855daed1be8cea98f6f803ffd7f87f9ecb2db5ea4acbe76994e6ef\": container with ID starting with 38035a73f5855daed1be8cea98f6f803ffd7f87f9ecb2db5ea4acbe76994e6ef not found: ID does not exist" containerID="38035a73f5855daed1be8cea98f6f803ffd7f87f9ecb2db5ea4acbe76994e6ef" Apr 20 15:16:17.033452 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:16:17.033259 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38035a73f5855daed1be8cea98f6f803ffd7f87f9ecb2db5ea4acbe76994e6ef"} err="failed to get container status \"38035a73f5855daed1be8cea98f6f803ffd7f87f9ecb2db5ea4acbe76994e6ef\": 
rpc error: code = NotFound desc = could not find container \"38035a73f5855daed1be8cea98f6f803ffd7f87f9ecb2db5ea4acbe76994e6ef\": container with ID starting with 38035a73f5855daed1be8cea98f6f803ffd7f87f9ecb2db5ea4acbe76994e6ef not found: ID does not exist" Apr 20 15:16:17.033452 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:16:17.033307 2577 scope.go:117] "RemoveContainer" containerID="3f9a3d8f7a20a2cda4ffef32a2781007e8a781967f08386c8b711bb852416a0a" Apr 20 15:16:17.033614 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:16:17.033592 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f9a3d8f7a20a2cda4ffef32a2781007e8a781967f08386c8b711bb852416a0a\": container with ID starting with 3f9a3d8f7a20a2cda4ffef32a2781007e8a781967f08386c8b711bb852416a0a not found: ID does not exist" containerID="3f9a3d8f7a20a2cda4ffef32a2781007e8a781967f08386c8b711bb852416a0a" Apr 20 15:16:17.033662 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:16:17.033638 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f9a3d8f7a20a2cda4ffef32a2781007e8a781967f08386c8b711bb852416a0a"} err="failed to get container status \"3f9a3d8f7a20a2cda4ffef32a2781007e8a781967f08386c8b711bb852416a0a\": rpc error: code = NotFound desc = could not find container \"3f9a3d8f7a20a2cda4ffef32a2781007e8a781967f08386c8b711bb852416a0a\": container with ID starting with 3f9a3d8f7a20a2cda4ffef32a2781007e8a781967f08386c8b711bb852416a0a not found: ID does not exist" Apr 20 15:16:17.075301 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:16:17.075236 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp592\" (UniqueName: \"kubernetes.io/projected/f28beb24-a23c-4127-9919-1ac38c95493f-kube-api-access-rp592\") pod \"f28beb24-a23c-4127-9919-1ac38c95493f\" (UID: \"f28beb24-a23c-4127-9919-1ac38c95493f\") " Apr 20 15:16:17.077528 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:16:17.077501 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f28beb24-a23c-4127-9919-1ac38c95493f-kube-api-access-rp592" (OuterVolumeSpecName: "kube-api-access-rp592") pod "f28beb24-a23c-4127-9919-1ac38c95493f" (UID: "f28beb24-a23c-4127-9919-1ac38c95493f"). InnerVolumeSpecName "kube-api-access-rp592". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:16:17.176993 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:16:17.176893 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rp592\" (UniqueName: \"kubernetes.io/projected/f28beb24-a23c-4127-9919-1ac38c95493f-kube-api-access-rp592\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:16:17.332383 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:16:17.332348 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611635-7h2ql"] Apr 20 15:16:17.336242 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:16:17.336208 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611635-7h2ql"] Apr 20 15:16:18.196054 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:16:18.196017 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f28beb24-a23c-4127-9919-1ac38c95493f" path="/var/lib/kubelet/pods/f28beb24-a23c-4127-9919-1ac38c95493f/volumes" Apr 20 15:17:38.352037 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:17:38.352010 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-acl-logging/0.log" Apr 20 15:17:38.354577 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:17:38.352662 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-acl-logging/0.log" Apr 20 15:22:38.377471 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:22:38.377346 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-acl-logging/0.log" Apr 20 15:22:38.381531 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:22:38.378788 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-acl-logging/0.log" Apr 20 15:27:36.820749 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:27:36.820709 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h"] Apr 20 15:27:36.821275 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:27:36.820952 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h" podUID="ff967ac9-acbb-4159-904d-367dc3f8d6db" containerName="manager" containerID="cri-o://67ce7af1a19802a51e734c0be312d5a2c8f1be51b2e2321c66966d4a1742b7b0" gracePeriod=10 Apr 20 15:27:37.281520 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:27:37.281493 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h" Apr 20 15:27:37.397710 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:27:37.397630 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ff967ac9-acbb-4159-904d-367dc3f8d6db-extensions-socket-volume\") pod \"ff967ac9-acbb-4159-904d-367dc3f8d6db\" (UID: \"ff967ac9-acbb-4159-904d-367dc3f8d6db\") " Apr 20 15:27:37.397855 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:27:37.397749 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmxs9\" (UniqueName: \"kubernetes.io/projected/ff967ac9-acbb-4159-904d-367dc3f8d6db-kube-api-access-lmxs9\") pod \"ff967ac9-acbb-4159-904d-367dc3f8d6db\" (UID: \"ff967ac9-acbb-4159-904d-367dc3f8d6db\") " Apr 20 15:27:37.398031 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:27:37.398006 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff967ac9-acbb-4159-904d-367dc3f8d6db-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "ff967ac9-acbb-4159-904d-367dc3f8d6db" (UID: "ff967ac9-acbb-4159-904d-367dc3f8d6db"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:27:37.399791 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:27:37.399766 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff967ac9-acbb-4159-904d-367dc3f8d6db-kube-api-access-lmxs9" (OuterVolumeSpecName: "kube-api-access-lmxs9") pod "ff967ac9-acbb-4159-904d-367dc3f8d6db" (UID: "ff967ac9-acbb-4159-904d-367dc3f8d6db"). InnerVolumeSpecName "kube-api-access-lmxs9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:27:37.439388 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:27:37.439352 2577 generic.go:358] "Generic (PLEG): container finished" podID="ff967ac9-acbb-4159-904d-367dc3f8d6db" containerID="67ce7af1a19802a51e734c0be312d5a2c8f1be51b2e2321c66966d4a1742b7b0" exitCode=0 Apr 20 15:27:37.439577 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:27:37.439424 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h" Apr 20 15:27:37.439577 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:27:37.439440 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h" event={"ID":"ff967ac9-acbb-4159-904d-367dc3f8d6db","Type":"ContainerDied","Data":"67ce7af1a19802a51e734c0be312d5a2c8f1be51b2e2321c66966d4a1742b7b0"} Apr 20 15:27:37.439577 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:27:37.439483 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h" event={"ID":"ff967ac9-acbb-4159-904d-367dc3f8d6db","Type":"ContainerDied","Data":"68870cbfe1d45fbba7bcc911efb37f9d9dfde7d6aa81262a2af80772c1564317"} Apr 20 15:27:37.439577 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:27:37.439499 2577 scope.go:117] "RemoveContainer" containerID="67ce7af1a19802a51e734c0be312d5a2c8f1be51b2e2321c66966d4a1742b7b0" Apr 20 15:27:37.448749 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:27:37.448731 2577 scope.go:117] "RemoveContainer" containerID="67ce7af1a19802a51e734c0be312d5a2c8f1be51b2e2321c66966d4a1742b7b0" Apr 20 15:27:37.449028 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:27:37.449008 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67ce7af1a19802a51e734c0be312d5a2c8f1be51b2e2321c66966d4a1742b7b0\": container with ID starting with 67ce7af1a19802a51e734c0be312d5a2c8f1be51b2e2321c66966d4a1742b7b0 not found: ID does not exist" containerID="67ce7af1a19802a51e734c0be312d5a2c8f1be51b2e2321c66966d4a1742b7b0" Apr 20 15:27:37.449090 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:27:37.449038 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67ce7af1a19802a51e734c0be312d5a2c8f1be51b2e2321c66966d4a1742b7b0"} err="failed to get container status \"67ce7af1a19802a51e734c0be312d5a2c8f1be51b2e2321c66966d4a1742b7b0\": rpc error: code = NotFound desc = could not find container \"67ce7af1a19802a51e734c0be312d5a2c8f1be51b2e2321c66966d4a1742b7b0\": container with ID starting with 67ce7af1a19802a51e734c0be312d5a2c8f1be51b2e2321c66966d4a1742b7b0 not found: ID does not exist" Apr 20 15:27:37.461956 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:27:37.461928 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h"] Apr 20 15:27:37.467352 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:27:37.467317 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4x42h"] Apr 20 15:27:37.499081 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:27:37.499058 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lmxs9\" (UniqueName: \"kubernetes.io/projected/ff967ac9-acbb-4159-904d-367dc3f8d6db-kube-api-access-lmxs9\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:27:37.499081 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:27:37.499080 2577 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ff967ac9-acbb-4159-904d-367dc3f8d6db-extensions-socket-volume\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:27:38.197311 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:27:38.197260 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ff967ac9-acbb-4159-904d-367dc3f8d6db" path="/var/lib/kubelet/pods/ff967ac9-acbb-4159-904d-367dc3f8d6db/volumes" Apr 20 15:27:38.408331 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:27:38.408217 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-acl-logging/0.log" Apr 20 15:27:38.423719 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:27:38.410702 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-acl-logging/0.log" Apr 20 15:28:42.929655 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:42.929616 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6w4mf"] Apr 20 15:28:42.930147 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:42.930129 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff967ac9-acbb-4159-904d-367dc3f8d6db" containerName="manager" Apr 20 15:28:42.930193 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:42.930150 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff967ac9-acbb-4159-904d-367dc3f8d6db" containerName="manager" Apr 20 15:28:42.930193 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:42.930163 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f28beb24-a23c-4127-9919-1ac38c95493f" containerName="cleanup" Apr 20 15:28:42.930193 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:42.930171 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f28beb24-a23c-4127-9919-1ac38c95493f" containerName="cleanup" Apr 20 15:28:42.930193 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:42.930184 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f28beb24-a23c-4127-9919-1ac38c95493f" containerName="cleanup" Apr 20 15:28:42.930193 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:42.930194 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f28beb24-a23c-4127-9919-1ac38c95493f" containerName="cleanup" Apr 20 15:28:42.930384 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:42.930203 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f28beb24-a23c-4127-9919-1ac38c95493f" containerName="cleanup" Apr 20 15:28:42.930384 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:42.930208 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f28beb24-a23c-4127-9919-1ac38c95493f" containerName="cleanup" Apr 20 15:28:42.930384 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:42.930301 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f28beb24-a23c-4127-9919-1ac38c95493f" containerName="cleanup" Apr 20 15:28:42.930384 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:42.930310 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f28beb24-a23c-4127-9919-1ac38c95493f" containerName="cleanup" Apr 20 15:28:42.930384 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:42.930317 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff967ac9-acbb-4159-904d-367dc3f8d6db" containerName="manager" Apr 20 15:28:42.933431 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:42.933412 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6w4mf" Apr 20 15:28:42.936827 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:42.936804 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-8b6tn\"" Apr 20 15:28:42.951147 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:42.951122 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6w4mf"] Apr 20 15:28:43.075441 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:43.075410 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzxll\" (UniqueName: \"kubernetes.io/projected/5966e921-a60d-4714-abaf-264155bfde69-kube-api-access-kzxll\") pod \"kuadrant-operator-controller-manager-55c7f4c975-6w4mf\" (UID: \"5966e921-a60d-4714-abaf-264155bfde69\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6w4mf" Apr 20 15:28:43.075622 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:43.075447 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5966e921-a60d-4714-abaf-264155bfde69-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-6w4mf\" (UID: \"5966e921-a60d-4714-abaf-264155bfde69\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6w4mf" Apr 20 15:28:43.176405 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:43.176365 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzxll\" (UniqueName: \"kubernetes.io/projected/5966e921-a60d-4714-abaf-264155bfde69-kube-api-access-kzxll\") pod \"kuadrant-operator-controller-manager-55c7f4c975-6w4mf\" (UID: \"5966e921-a60d-4714-abaf-264155bfde69\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6w4mf" Apr 20 15:28:43.176405 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:43.176410 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5966e921-a60d-4714-abaf-264155bfde69-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-6w4mf\" (UID: \"5966e921-a60d-4714-abaf-264155bfde69\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6w4mf" Apr 20 15:28:43.176835 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:43.176810 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5966e921-a60d-4714-abaf-264155bfde69-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-6w4mf\" (UID: \"5966e921-a60d-4714-abaf-264155bfde69\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6w4mf" Apr 20 15:28:43.187211 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:43.187154 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzxll\" (UniqueName: \"kubernetes.io/projected/5966e921-a60d-4714-abaf-264155bfde69-kube-api-access-kzxll\") pod \"kuadrant-operator-controller-manager-55c7f4c975-6w4mf\" (UID: \"5966e921-a60d-4714-abaf-264155bfde69\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6w4mf" Apr 20 15:28:43.244965 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:43.244937 2577 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6w4mf" Apr 20 15:28:43.376520 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:43.376493 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6w4mf"] Apr 20 15:28:43.379291 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:28:43.379260 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5966e921_a60d_4714_abaf_264155bfde69.slice/crio-be6c8ef00bcbf5520a5e1077bc57775ddc29bb45f61bc4d0bcd93ffe1ddb4cdd WatchSource:0}: Error finding container be6c8ef00bcbf5520a5e1077bc57775ddc29bb45f61bc4d0bcd93ffe1ddb4cdd: Status 404 returned error can't find the container with id be6c8ef00bcbf5520a5e1077bc57775ddc29bb45f61bc4d0bcd93ffe1ddb4cdd Apr 20 15:28:43.381546 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:43.381528 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 15:28:43.685322 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:43.685262 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6w4mf" event={"ID":"5966e921-a60d-4714-abaf-264155bfde69","Type":"ContainerStarted","Data":"09315242a87458314e9389f44270b154a78d6ed2a17df1199a9c588ec3df4fa5"} Apr 20 15:28:43.685322 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:43.685317 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6w4mf" event={"ID":"5966e921-a60d-4714-abaf-264155bfde69","Type":"ContainerStarted","Data":"be6c8ef00bcbf5520a5e1077bc57775ddc29bb45f61bc4d0bcd93ffe1ddb4cdd"} Apr 20 15:28:43.685542 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:43.685462 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6w4mf" Apr 20 15:28:43.703479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:43.703409 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6w4mf" podStartSLOduration=1.7033949430000002 podStartE2EDuration="1.703394943s" podCreationTimestamp="2026-04-20 15:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:28:43.702551518 +0000 UTC m=+1574.138095672" watchObservedRunningTime="2026-04-20 15:28:43.703394943 +0000 UTC m=+1574.138939096" Apr 20 15:28:54.692127 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:28:54.692096 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6w4mf" Apr 20 15:30:00.129503 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:00.129469 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29611650-4smds"] Apr 20 15:30:00.129956 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:00.129909 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f28beb24-a23c-4127-9919-1ac38c95493f" containerName="cleanup" Apr 20 15:30:00.132920 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:00.132904 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611650-4smds" Apr 20 15:30:00.135324 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:00.135280 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-5m5vb\"" Apr 20 15:30:00.140013 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:00.139990 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611650-4smds"] Apr 20 15:30:00.140533 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:00.140511 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8k9x\" (UniqueName: \"kubernetes.io/projected/18a53a42-adfd-4ef4-811b-b8e78d18e3b1-kube-api-access-x8k9x\") pod \"maas-api-key-cleanup-29611650-4smds\" (UID: \"18a53a42-adfd-4ef4-811b-b8e78d18e3b1\") " pod="opendatahub/maas-api-key-cleanup-29611650-4smds" Apr 20 15:30:00.241595 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:00.241564 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8k9x\" (UniqueName: \"kubernetes.io/projected/18a53a42-adfd-4ef4-811b-b8e78d18e3b1-kube-api-access-x8k9x\") pod \"maas-api-key-cleanup-29611650-4smds\" (UID: \"18a53a42-adfd-4ef4-811b-b8e78d18e3b1\") " pod="opendatahub/maas-api-key-cleanup-29611650-4smds" Apr 20 15:30:00.250616 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:00.250583 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8k9x\" (UniqueName: \"kubernetes.io/projected/18a53a42-adfd-4ef4-811b-b8e78d18e3b1-kube-api-access-x8k9x\") pod \"maas-api-key-cleanup-29611650-4smds\" (UID: \"18a53a42-adfd-4ef4-811b-b8e78d18e3b1\") " pod="opendatahub/maas-api-key-cleanup-29611650-4smds" Apr 20 15:30:00.444963 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:00.444867 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611650-4smds" Apr 20 15:30:00.572336 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:00.572309 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611650-4smds"] Apr 20 15:30:00.573831 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:30:00.573802 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18a53a42_adfd_4ef4_811b_b8e78d18e3b1.slice/crio-4adc4b31fde62eee3fbecfe9c8e67c49112e482c1caa58f896850f9558bbbc44 WatchSource:0}: Error finding container 4adc4b31fde62eee3fbecfe9c8e67c49112e482c1caa58f896850f9558bbbc44: Status 404 returned error can't find the container with id 4adc4b31fde62eee3fbecfe9c8e67c49112e482c1caa58f896850f9558bbbc44 Apr 20 15:30:00.958859 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:00.958826 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611650-4smds" event={"ID":"18a53a42-adfd-4ef4-811b-b8e78d18e3b1","Type":"ContainerStarted","Data":"78def1c894a3e81492b707e39545a60f20c7943d7966ce45c950c5c92d68e00e"} Apr 20 15:30:00.958859 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:00.958862 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611650-4smds" event={"ID":"18a53a42-adfd-4ef4-811b-b8e78d18e3b1","Type":"ContainerStarted","Data":"4adc4b31fde62eee3fbecfe9c8e67c49112e482c1caa58f896850f9558bbbc44"} Apr 20 15:30:00.973658 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:00.973613 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29611650-4smds" podStartSLOduration=0.973598438 podStartE2EDuration="973.598438ms" podCreationTimestamp="2026-04-20 15:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:30:00.972585786 +0000 UTC m=+1651.408129940" watchObservedRunningTime="2026-04-20 15:30:00.973598438 +0000 UTC m=+1651.409142644" Apr 20 15:30:22.034929 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:22.034890 2577 generic.go:358] "Generic (PLEG): container finished" podID="18a53a42-adfd-4ef4-811b-b8e78d18e3b1" containerID="78def1c894a3e81492b707e39545a60f20c7943d7966ce45c950c5c92d68e00e" exitCode=6 Apr 20 15:30:22.035445 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:22.034935 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611650-4smds" event={"ID":"18a53a42-adfd-4ef4-811b-b8e78d18e3b1","Type":"ContainerDied","Data":"78def1c894a3e81492b707e39545a60f20c7943d7966ce45c950c5c92d68e00e"} Apr 20 15:30:22.035445 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:22.035315 2577 scope.go:117] "RemoveContainer" containerID="78def1c894a3e81492b707e39545a60f20c7943d7966ce45c950c5c92d68e00e" Apr 20 15:30:23.039604 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:23.039571 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611650-4smds" event={"ID":"18a53a42-adfd-4ef4-811b-b8e78d18e3b1","Type":"ContainerStarted","Data":"95eb2d00c3d6b9459bc2d00145c452d39e783eac03340bb54674eed3f603c613"} Apr 20 15:30:43.123848 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:43.123762 2577 generic.go:358] "Generic (PLEG): container finished" podID="18a53a42-adfd-4ef4-811b-b8e78d18e3b1" containerID="95eb2d00c3d6b9459bc2d00145c452d39e783eac03340bb54674eed3f603c613" exitCode=6 Apr 20 
15:30:43.124377 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:43.123836 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611650-4smds" event={"ID":"18a53a42-adfd-4ef4-811b-b8e78d18e3b1","Type":"ContainerDied","Data":"95eb2d00c3d6b9459bc2d00145c452d39e783eac03340bb54674eed3f603c613"} Apr 20 15:30:43.124377 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:43.123887 2577 scope.go:117] "RemoveContainer" containerID="78def1c894a3e81492b707e39545a60f20c7943d7966ce45c950c5c92d68e00e" Apr 20 15:30:43.124377 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:43.124204 2577 scope.go:117] "RemoveContainer" containerID="95eb2d00c3d6b9459bc2d00145c452d39e783eac03340bb54674eed3f603c613" Apr 20 15:30:43.124554 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:30:43.124478 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29611650-4smds_opendatahub(18a53a42-adfd-4ef4-811b-b8e78d18e3b1)\"" pod="opendatahub/maas-api-key-cleanup-29611650-4smds" podUID="18a53a42-adfd-4ef4-811b-b8e78d18e3b1" Apr 20 15:30:55.191817 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:55.191786 2577 scope.go:117] "RemoveContainer" containerID="95eb2d00c3d6b9459bc2d00145c452d39e783eac03340bb54674eed3f603c613" Apr 20 15:30:56.175308 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:56.175252 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611650-4smds" event={"ID":"18a53a42-adfd-4ef4-811b-b8e78d18e3b1","Type":"ContainerStarted","Data":"e257c42f0cd2e06431c03875d3074d9240015c5b8b51955b01496b9370729f8c"} Apr 20 15:30:56.216141 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:56.216108 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611650-4smds"] Apr 20 15:30:57.178803 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:30:57.178745 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29611650-4smds" podUID="18a53a42-adfd-4ef4-811b-b8e78d18e3b1" containerName="cleanup" containerID="cri-o://e257c42f0cd2e06431c03875d3074d9240015c5b8b51955b01496b9370729f8c" gracePeriod=30 Apr 20 15:31:16.028219 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:31:16.028194 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611650-4smds" Apr 20 15:31:16.064948 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:31:16.064917 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8k9x\" (UniqueName: \"kubernetes.io/projected/18a53a42-adfd-4ef4-811b-b8e78d18e3b1-kube-api-access-x8k9x\") pod \"18a53a42-adfd-4ef4-811b-b8e78d18e3b1\" (UID: \"18a53a42-adfd-4ef4-811b-b8e78d18e3b1\") " Apr 20 15:31:16.066901 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:31:16.066874 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18a53a42-adfd-4ef4-811b-b8e78d18e3b1-kube-api-access-x8k9x" (OuterVolumeSpecName: "kube-api-access-x8k9x") pod "18a53a42-adfd-4ef4-811b-b8e78d18e3b1" (UID: "18a53a42-adfd-4ef4-811b-b8e78d18e3b1"). InnerVolumeSpecName "kube-api-access-x8k9x". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:31:16.166376 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:31:16.166280 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x8k9x\" (UniqueName: \"kubernetes.io/projected/18a53a42-adfd-4ef4-811b-b8e78d18e3b1-kube-api-access-x8k9x\") on node \"ip-10-0-129-115.ec2.internal\" DevicePath \"\"" Apr 20 15:31:16.243519 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:31:16.243485 2577 generic.go:358] "Generic (PLEG): container finished" podID="18a53a42-adfd-4ef4-811b-b8e78d18e3b1" containerID="e257c42f0cd2e06431c03875d3074d9240015c5b8b51955b01496b9370729f8c" exitCode=6 Apr 20 15:31:16.243654 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:31:16.243553 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611650-4smds" Apr 20 15:31:16.243654 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:31:16.243575 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611650-4smds" event={"ID":"18a53a42-adfd-4ef4-811b-b8e78d18e3b1","Type":"ContainerDied","Data":"e257c42f0cd2e06431c03875d3074d9240015c5b8b51955b01496b9370729f8c"} Apr 20 15:31:16.243654 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:31:16.243612 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611650-4smds" event={"ID":"18a53a42-adfd-4ef4-811b-b8e78d18e3b1","Type":"ContainerDied","Data":"4adc4b31fde62eee3fbecfe9c8e67c49112e482c1caa58f896850f9558bbbc44"} Apr 20 15:31:16.243654 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:31:16.243627 2577 scope.go:117] "RemoveContainer" containerID="e257c42f0cd2e06431c03875d3074d9240015c5b8b51955b01496b9370729f8c" Apr 20 15:31:16.252367 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:31:16.252349 2577 scope.go:117] "RemoveContainer" containerID="95eb2d00c3d6b9459bc2d00145c452d39e783eac03340bb54674eed3f603c613" Apr 20 15:31:16.258989 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:31:16.258968 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611650-4smds"] Apr 20 15:31:16.259828 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:31:16.259795 2577 scope.go:117] "RemoveContainer" containerID="e257c42f0cd2e06431c03875d3074d9240015c5b8b51955b01496b9370729f8c" Apr 20 15:31:16.260072 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:31:16.260047 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e257c42f0cd2e06431c03875d3074d9240015c5b8b51955b01496b9370729f8c\": container with ID starting with e257c42f0cd2e06431c03875d3074d9240015c5b8b51955b01496b9370729f8c not found: ID does not exist" containerID="e257c42f0cd2e06431c03875d3074d9240015c5b8b51955b01496b9370729f8c" Apr 20 15:31:16.260122 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:31:16.260075 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e257c42f0cd2e06431c03875d3074d9240015c5b8b51955b01496b9370729f8c"} err="failed to get container status \"e257c42f0cd2e06431c03875d3074d9240015c5b8b51955b01496b9370729f8c\": rpc error: code = NotFound desc = could not find container \"e257c42f0cd2e06431c03875d3074d9240015c5b8b51955b01496b9370729f8c\": container with ID starting with e257c42f0cd2e06431c03875d3074d9240015c5b8b51955b01496b9370729f8c not found: ID does not exist" Apr 20 15:31:16.260122 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:31:16.260092 2577 scope.go:117] 
"RemoveContainer" containerID="95eb2d00c3d6b9459bc2d00145c452d39e783eac03340bb54674eed3f603c613" Apr 20 15:31:16.260347 ip-10-0-129-115 kubenswrapper[2577]: E0420 15:31:16.260328 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95eb2d00c3d6b9459bc2d00145c452d39e783eac03340bb54674eed3f603c613\": container with ID starting with 95eb2d00c3d6b9459bc2d00145c452d39e783eac03340bb54674eed3f603c613 not found: ID does not exist" containerID="95eb2d00c3d6b9459bc2d00145c452d39e783eac03340bb54674eed3f603c613" Apr 20 15:31:16.260391 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:31:16.260354 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95eb2d00c3d6b9459bc2d00145c452d39e783eac03340bb54674eed3f603c613"} err="failed to get container status \"95eb2d00c3d6b9459bc2d00145c452d39e783eac03340bb54674eed3f603c613\": rpc error: code = NotFound desc = could not find container \"95eb2d00c3d6b9459bc2d00145c452d39e783eac03340bb54674eed3f603c613\": container with ID starting with 95eb2d00c3d6b9459bc2d00145c452d39e783eac03340bb54674eed3f603c613 not found: ID does not exist" Apr 20 15:31:16.264495 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:31:16.264475 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611650-4smds"] Apr 20 15:31:18.196262 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:31:18.196230 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18a53a42-adfd-4ef4-811b-b8e78d18e3b1" path="/var/lib/kubelet/pods/18a53a42-adfd-4ef4-811b-b8e78d18e3b1/volumes" Apr 20 15:32:38.437868 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:32:38.437753 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-acl-logging/0.log" Apr 20 15:32:38.441447 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:32:38.441423 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-acl-logging/0.log" Apr 20 15:37:38.465178 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:37:38.465143 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-acl-logging/0.log" Apr 20 15:37:38.468898 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:37:38.468877 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-acl-logging/0.log" Apr 20 15:38:28.263506 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:28.263427 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6cdf6786d9-282sq_f586be21-da87-4be6-a898-fc8c56047904/manager/0.log" Apr 20 15:38:29.950378 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:29.950341 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-n62pm_3bacaaa3-bf99-43e3-ac8d-b4f397c4c9ee/manager/0.log" Apr 20 15:38:30.059450 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:30.059419 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-nb47l_fd16d6fe-db4f-44b0-bc6c-b1bb4340172a/manager/0.log" Apr 20 15:38:30.170540 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:30.170509 2577 log.go:25] "Finished parsing log 
file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-wd94m_db3f3f71-699c-432f-a05e-dadebeccd795/kuadrant-console-plugin/0.log" Apr 20 15:38:30.296608 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:30.296579 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-wh4hx_34a47557-4f99-476a-b736-08f8ffe4db64/registry-server/0.log" Apr 20 15:38:30.416452 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:30.416416 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-6w4mf_5966e921-a60d-4714-abaf-264155bfde69/manager/0.log" Apr 20 15:38:31.007469 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:31.007438 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf_ef984f14-1549-4653-b515-f92a6fa0b180/istio-proxy/0.log" Apr 20 15:38:31.573700 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:31.573670 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-8649c78dc4-vwfc7_a56509b8-31c7-4b3a-b397-2eff1d2f128c/router/0.log" Apr 20 15:38:32.023604 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:32.023562 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw_980008de-3ac8-4794-8355-81ecf568e2d3/storage-initializer/0.log" Apr 20 15:38:32.030329 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:32.030301 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-wsdbw_980008de-3ac8-4794-8355-81ecf568e2d3/main/0.log" Apr 20 15:38:32.140632 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:32.140597 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-72sb8_ea02a932-86e0-416f-9299-9fc5ffeae749/storage-initializer/0.log" Apr 20 15:38:32.147145 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:32.147125 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-72sb8_ea02a932-86e0-416f-9299-9fc5ffeae749/main/0.log" Apr 20 15:38:32.504338 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:32.504308 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl_86d3a929-df38-4ee6-993c-de18d9db2ae8/storage-initializer/0.log" Apr 20 15:38:32.511606 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:32.511581 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-7h6wl_86d3a929-df38-4ee6-993c-de18d9db2ae8/main/0.log" Apr 20 15:38:39.235175 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:39.235131 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-2vwvt_eaf0e44b-a5e1-426a-8cc3-36a330c48389/global-pull-secret-syncer/0.log" Apr 20 15:38:39.314837 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:39.314804 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-kztg4_dfd40e78-07ee-46ff-90d1-6f4a6d4baa55/konnectivity-agent/0.log" Apr 20 15:38:39.395465 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:39.395435 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-115.ec2.internal_6ec3cd5b714ec0026f823b211f468c89/haproxy/0.log" Apr 20 15:38:44.175118 ip-10-0-129-115 
kubenswrapper[2577]: I0420 15:38:44.175086 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-n62pm_3bacaaa3-bf99-43e3-ac8d-b4f397c4c9ee/manager/0.log" Apr 20 15:38:44.200096 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:44.200066 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-nb47l_fd16d6fe-db4f-44b0-bc6c-b1bb4340172a/manager/0.log" Apr 20 15:38:44.229971 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:44.229943 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-wd94m_db3f3f71-699c-432f-a05e-dadebeccd795/kuadrant-console-plugin/0.log" Apr 20 15:38:44.263030 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:44.262979 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-wh4hx_34a47557-4f99-476a-b736-08f8ffe4db64/registry-server/0.log" Apr 20 15:38:44.328418 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:44.328380 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-6w4mf_5966e921-a60d-4714-abaf-264155bfde69/manager/0.log" Apr 20 15:38:46.166518 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:46.166490 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-f684b8c45-5dpm6_58db9f7d-153a-4038-be2f-db349004a778/metrics-server/0.log" Apr 20 15:38:46.190083 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:46.190055 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-f2vbx_96aa1153-b6d1-41da-b998-8cb9a9e6f56a/monitoring-plugin/0.log" Apr 20 15:38:46.290816 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:46.290789 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bhvzv_65bc4e5e-758a-43c0-abbc-de79866d22b8/node-exporter/0.log" Apr 20 15:38:46.309810 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:46.309782 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bhvzv_65bc4e5e-758a-43c0-abbc-de79866d22b8/kube-rbac-proxy/0.log" Apr 20 15:38:46.329739 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:46.329714 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bhvzv_65bc4e5e-758a-43c0-abbc-de79866d22b8/init-textfile/0.log" Apr 20 15:38:46.661112 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:46.661047 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-fdghk_c85d567e-27dc-4cfa-bcf6-f4b3181c4e62/prometheus-operator/0.log" Apr 20 15:38:46.680413 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:46.680375 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-fdghk_c85d567e-27dc-4cfa-bcf6-f4b3181c4e62/kube-rbac-proxy/0.log" Apr 20 15:38:46.703021 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:46.702990 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-xkh5p_62be979d-08db-4830-a936-48380c484f67/prometheus-operator-admission-webhook/0.log" Apr 20 15:38:47.881077 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:47.881043 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq"] Apr 20 15:38:47.881479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:47.881432 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18a53a42-adfd-4ef4-811b-b8e78d18e3b1" containerName="cleanup" Apr 20 15:38:47.881479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:47.881444 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a53a42-adfd-4ef4-811b-b8e78d18e3b1" containerName="cleanup" Apr 20 15:38:47.881479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:47.881465 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18a53a42-adfd-4ef4-811b-b8e78d18e3b1" containerName="cleanup" Apr 20 15:38:47.881479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:47.881471 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a53a42-adfd-4ef4-811b-b8e78d18e3b1" containerName="cleanup" Apr 20 15:38:47.881479 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:47.881480 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18a53a42-adfd-4ef4-811b-b8e78d18e3b1" containerName="cleanup" Apr 20 15:38:47.881668 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:47.881486 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a53a42-adfd-4ef4-811b-b8e78d18e3b1" containerName="cleanup" Apr 20 15:38:47.881668 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:47.881543 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="18a53a42-adfd-4ef4-811b-b8e78d18e3b1" containerName="cleanup" Apr 20 15:38:47.881668 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:47.881552 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="18a53a42-adfd-4ef4-811b-b8e78d18e3b1" containerName="cleanup" Apr 20 15:38:47.884657 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:47.884641 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" Apr 20 15:38:47.886904 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:47.886875 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-58p5q\"/\"kube-root-ca.crt\"" Apr 20 15:38:47.887057 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:47.886930 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-58p5q\"/\"openshift-service-ca.crt\"" Apr 20 15:38:47.887057 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:47.886887 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-58p5q\"/\"default-dockercfg-qgplg\"" Apr 20 15:38:47.890667 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:47.890640 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq"] Apr 20 15:38:48.046744 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.046705 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wg6g\" (UniqueName: \"kubernetes.io/projected/ec8c13f9-2bc4-44ce-80c5-0d287b2e5949-kube-api-access-4wg6g\") pod \"perf-node-gather-daemonset-4mgpq\" (UID: \"ec8c13f9-2bc4-44ce-80c5-0d287b2e5949\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" Apr 20 15:38:48.046924 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.046753 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ec8c13f9-2bc4-44ce-80c5-0d287b2e5949-podres\") pod \"perf-node-gather-daemonset-4mgpq\" (UID: \"ec8c13f9-2bc4-44ce-80c5-0d287b2e5949\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" Apr 20 15:38:48.046924 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.046816 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ec8c13f9-2bc4-44ce-80c5-0d287b2e5949-proc\") pod \"perf-node-gather-daemonset-4mgpq\" (UID: \"ec8c13f9-2bc4-44ce-80c5-0d287b2e5949\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" Apr 20 15:38:48.046924 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.046880 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ec8c13f9-2bc4-44ce-80c5-0d287b2e5949-sys\") pod \"perf-node-gather-daemonset-4mgpq\" (UID: \"ec8c13f9-2bc4-44ce-80c5-0d287b2e5949\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" Apr 20 15:38:48.047031 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.046960 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ec8c13f9-2bc4-44ce-80c5-0d287b2e5949-lib-modules\") pod \"perf-node-gather-daemonset-4mgpq\" (UID: \"ec8c13f9-2bc4-44ce-80c5-0d287b2e5949\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" Apr 20 15:38:48.147652 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.147558 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ec8c13f9-2bc4-44ce-80c5-0d287b2e5949-lib-modules\") pod \"perf-node-gather-daemonset-4mgpq\" (UID: \"ec8c13f9-2bc4-44ce-80c5-0d287b2e5949\") " 
pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" Apr 20 15:38:48.147652 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.147629 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wg6g\" (UniqueName: \"kubernetes.io/projected/ec8c13f9-2bc4-44ce-80c5-0d287b2e5949-kube-api-access-4wg6g\") pod \"perf-node-gather-daemonset-4mgpq\" (UID: \"ec8c13f9-2bc4-44ce-80c5-0d287b2e5949\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" Apr 20 15:38:48.147851 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.147655 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ec8c13f9-2bc4-44ce-80c5-0d287b2e5949-podres\") pod \"perf-node-gather-daemonset-4mgpq\" (UID: \"ec8c13f9-2bc4-44ce-80c5-0d287b2e5949\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" Apr 20 15:38:48.147851 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.147671 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ec8c13f9-2bc4-44ce-80c5-0d287b2e5949-proc\") pod \"perf-node-gather-daemonset-4mgpq\" (UID: \"ec8c13f9-2bc4-44ce-80c5-0d287b2e5949\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" Apr 20 15:38:48.147851 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.147715 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ec8c13f9-2bc4-44ce-80c5-0d287b2e5949-sys\") pod \"perf-node-gather-daemonset-4mgpq\" (UID: \"ec8c13f9-2bc4-44ce-80c5-0d287b2e5949\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" Apr 20 15:38:48.147851 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.147762 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ec8c13f9-2bc4-44ce-80c5-0d287b2e5949-lib-modules\") pod \"perf-node-gather-daemonset-4mgpq\" (UID: \"ec8c13f9-2bc4-44ce-80c5-0d287b2e5949\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" Apr 20 15:38:48.147851 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.147787 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ec8c13f9-2bc4-44ce-80c5-0d287b2e5949-proc\") pod \"perf-node-gather-daemonset-4mgpq\" (UID: \"ec8c13f9-2bc4-44ce-80c5-0d287b2e5949\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" Apr 20 15:38:48.147851 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.147803 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ec8c13f9-2bc4-44ce-80c5-0d287b2e5949-podres\") pod \"perf-node-gather-daemonset-4mgpq\" (UID: \"ec8c13f9-2bc4-44ce-80c5-0d287b2e5949\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" Apr 20 15:38:48.147851 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.147791 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ec8c13f9-2bc4-44ce-80c5-0d287b2e5949-sys\") pod \"perf-node-gather-daemonset-4mgpq\" (UID: \"ec8c13f9-2bc4-44ce-80c5-0d287b2e5949\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" Apr 20 15:38:48.155242 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.155212 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4wg6g\" (UniqueName: \"kubernetes.io/projected/ec8c13f9-2bc4-44ce-80c5-0d287b2e5949-kube-api-access-4wg6g\") pod \"perf-node-gather-daemonset-4mgpq\" (UID: \"ec8c13f9-2bc4-44ce-80c5-0d287b2e5949\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" Apr 20 15:38:48.195825 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.195797 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" Apr 20 15:38:48.319877 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.319831 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq"] Apr 20 15:38:48.323839 ip-10-0-129-115 kubenswrapper[2577]: W0420 15:38:48.323799 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podec8c13f9_2bc4_44ce_80c5_0d287b2e5949.slice/crio-08c2a4ea37cc19e955cd522fd81d3aa02a388e5f1041dac50287b5f3eba9fb82 WatchSource:0}: Error finding container 08c2a4ea37cc19e955cd522fd81d3aa02a388e5f1041dac50287b5f3eba9fb82: Status 404 returned error can't find the container with id 08c2a4ea37cc19e955cd522fd81d3aa02a388e5f1041dac50287b5f3eba9fb82 Apr 20 15:38:48.325817 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.325798 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 15:38:48.911953 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.911915 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" event={"ID":"ec8c13f9-2bc4-44ce-80c5-0d287b2e5949","Type":"ContainerStarted","Data":"d476f93b6ac1614f215224ddc693e1c6d80b9213b1dadd5834d3cc52d4ceb344"} Apr 20 15:38:48.911953 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.911952 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" event={"ID":"ec8c13f9-2bc4-44ce-80c5-0d287b2e5949","Type":"ContainerStarted","Data":"08c2a4ea37cc19e955cd522fd81d3aa02a388e5f1041dac50287b5f3eba9fb82"} Apr 20 15:38:48.912532 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.912025 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" Apr 20 15:38:48.927685 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.927642 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" podStartSLOduration=1.9276270210000002 podStartE2EDuration="1.927627021s" podCreationTimestamp="2026-04-20 15:38:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:38:48.925895457 +0000 UTC m=+2179.361439611" watchObservedRunningTime="2026-04-20 15:38:48.927627021 +0000 UTC m=+2179.363171230" Apr 20 15:38:48.954680 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.954658 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79975f5f76-splr8_017c3a70-c30c-4c24-a18f-60944ab58b29/console/0.log" Apr 20 15:38:48.983308 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:48.983255 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-c2qbw_9c3cc9ef-d0f3-4ac1-b2cb-c8a54c30ac41/download-server/0.log" Apr 20 15:38:49.497601 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:49.497561 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-dsskp_4d9bf241-422c-4da3-9dc2-cb3f82bb8201/volume-data-source-validator/0.log" Apr 20 15:38:50.287816 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:50.287785 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rwbpz_548662da-3b01-418f-b71e-7805525a03e5/dns/0.log" Apr 20 15:38:50.307075 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:50.307046 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rwbpz_548662da-3b01-418f-b71e-7805525a03e5/kube-rbac-proxy/0.log" Apr 20 15:38:50.367801 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:50.367774 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hc5nc_423472ab-8267-4409-9aa0-f1d4a9c14e79/dns-node-resolver/0.log" Apr 20 15:38:50.833526 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:50.833461 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-b959fb7fc-fmqvl_36c8144f-abb6-49c5-b926-7737d427688b/registry/1.log" Apr 20 15:38:50.840859 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:50.840837 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-b959fb7fc-fmqvl_36c8144f-abb6-49c5-b926-7737d427688b/registry/2.log" Apr 20 15:38:50.858552 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:50.858520 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7vhjb_3b19ab41-5f62-4594-9262-8789718fb9e9/node-ca/0.log" Apr 20 15:38:51.725748 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:51.725716 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf9rzzf_ef984f14-1549-4653-b515-f92a6fa0b180/istio-proxy/0.log" Apr 20 15:38:51.938515 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:51.938482 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-8649c78dc4-vwfc7_a56509b8-31c7-4b3a-b397-2eff1d2f128c/router/0.log" Apr 20 15:38:52.424634 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:52.424604 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-7rhcs_cbcbf670-2941-48c2-8a4a-b5f253135d10/serve-healthcheck-canary/0.log" Apr 20 15:38:52.923700 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:52.923665 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-mgv7v_23ae8320-d3f3-4b75-8ae2-136783ad218b/insights-operator/0.log" Apr 20 15:38:52.926205 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:52.926180 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-mgv7v_23ae8320-d3f3-4b75-8ae2-136783ad218b/insights-operator/1.log" Apr 20 15:38:53.074741 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:53.074709 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8mrp8_f34a05dd-26aa-4559-a6d7-361a7c4e19c8/kube-rbac-proxy/0.log" Apr 20 15:38:53.095778 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:53.095749 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8mrp8_f34a05dd-26aa-4559-a6d7-361a7c4e19c8/exporter/0.log" Apr 20 15:38:53.115699 ip-10-0-129-115 kubenswrapper[2577]: 
I0420 15:38:53.115668 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8mrp8_f34a05dd-26aa-4559-a6d7-361a7c4e19c8/extractor/0.log" Apr 20 15:38:54.926838 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:54.926811 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-4mgpq" Apr 20 15:38:55.083814 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:55.083782 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6cdf6786d9-282sq_f586be21-da87-4be6-a898-fc8c56047904/manager/0.log" Apr 20 15:38:56.326741 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:38:56.326715 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6b8584f779-646fk_e68b109d-9188-4f00-965a-775902235d56/manager/0.log" Apr 20 15:39:00.709943 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:39:00.709906 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-c8wr7_c1c75800-433e-4b2e-a48d-b70c37d34e5a/migrator/0.log" Apr 20 15:39:00.730943 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:39:00.730914 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-c8wr7_c1c75800-433e-4b2e-a48d-b70c37d34e5a/graceful-termination/0.log" Apr 20 15:39:02.140783 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:39:02.140750 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cx4lv_a42bcb4e-3ab0-49cb-8302-c0f2152edc3d/kube-multus-additional-cni-plugins/0.log" Apr 20 15:39:02.159564 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:39:02.159528 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cx4lv_a42bcb4e-3ab0-49cb-8302-c0f2152edc3d/egress-router-binary-copy/0.log" Apr 20 15:39:02.178795 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:39:02.178764 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cx4lv_a42bcb4e-3ab0-49cb-8302-c0f2152edc3d/cni-plugins/0.log" Apr 20 15:39:02.198717 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:39:02.198686 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cx4lv_a42bcb4e-3ab0-49cb-8302-c0f2152edc3d/bond-cni-plugin/0.log" Apr 20 15:39:02.219613 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:39:02.219578 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cx4lv_a42bcb4e-3ab0-49cb-8302-c0f2152edc3d/routeoverride-cni/0.log" Apr 20 15:39:02.240511 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:39:02.240480 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cx4lv_a42bcb4e-3ab0-49cb-8302-c0f2152edc3d/whereabouts-cni-bincopy/0.log" Apr 20 15:39:02.262564 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:39:02.262532 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cx4lv_a42bcb4e-3ab0-49cb-8302-c0f2152edc3d/whereabouts-cni/0.log" Apr 20 15:39:02.536451 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:39:02.536421 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-mm98x_f68a7ade-f858-4f5d-b2ac-1ea4270c1737/kube-multus/0.log" Apr 20 15:39:02.611190 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:39:02.611161 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dp887_5987592a-660d-4466-bbc4-5bd812cca838/network-metrics-daemon/0.log" Apr 20 15:39:02.631396 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:39:02.631363 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dp887_5987592a-660d-4466-bbc4-5bd812cca838/kube-rbac-proxy/0.log" Apr 20 15:39:04.384802 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:39:04.384723 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-controller/0.log" Apr 20 15:39:04.403255 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:39:04.403225 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-acl-logging/0.log" Apr 20 15:39:04.412656 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:39:04.412634 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovn-acl-logging/1.log" Apr 20 15:39:04.431752 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:39:04.431725 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/kube-rbac-proxy-node/0.log" Apr 20 15:39:04.451674 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:39:04.451645 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 15:39:04.468211 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:39:04.468176 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/northd/0.log" Apr 20 15:39:04.493134 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:39:04.493109 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/nbdb/0.log" Apr 20 15:39:04.512630 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:39:04.512607 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/sbdb/0.log" Apr 20 15:39:04.604085 ip-10-0-129-115 kubenswrapper[2577]: I0420 15:39:04.604055 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9pxn_65dfd729-42e3-45f2-8f76-eec9fa62c8c4/ovnkube-controller/0.log"