Apr 24 21:24:55.969522 ip-10-0-136-160 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 21:24:55.969531 ip-10-0-136-160 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 21:24:55.969538 ip-10-0-136-160 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 21:24:55.969742 ip-10-0-136-160 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 21:25:06.004457 ip-10-0-136-160 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 21:25:06.004501 ip-10-0-136-160 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot bc496b5d9b994107947032bda00f8fc3 --
Apr 24 21:27:37.949677 ip-10-0-136-160 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:27:38.351498 ip-10-0-136-160 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:38.351498 ip-10-0-136-160 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:27:38.351498 ip-10-0-136-160 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:38.351498 ip-10-0-136-160 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:27:38.351498 ip-10-0-136-160 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:38.353016 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.352892 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:27:38.356576 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356562 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:38.356576 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356576 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:38.356642 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356580 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:38.356642 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356583 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:38.356642 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356585 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:38.356642 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356588 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:38.356642 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356591 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:38.356642 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356594 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:38.356642 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356596 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:38.356642 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356599 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:38.356642 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356602 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:38.356642 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356605 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:38.356642 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356607 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:38.356642 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356610 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:38.356642 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356612 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:38.356642 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356615 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:38.356642 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356617 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:38.356642 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356620 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:38.356642 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356631 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:38.356642 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356636 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:38.356642 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356639 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:38.357078 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356641 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:38.357078 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356644 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:38.357078 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356648 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:38.357078 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356651 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:38.357078 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356654 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:38.357078 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356656 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:38.357078 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356659 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:38.357078 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356662 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:38.357078 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356664 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:38.357078 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356667 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:38.357078 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356669 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:38.357078 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356671 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:38.357078 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356674 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:38.357078 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356676 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:38.357078 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356679 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:38.357078 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356681 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:38.357078 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356684 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:38.357078 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356686 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:38.357078 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356689 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:38.357078 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356691 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:38.357623 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356694 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:38.357623 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356696 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:38.357623 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356699 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:38.357623 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356721 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:38.357623 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356725 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:38.357623 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356728 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:38.357623 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356731 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:38.357623 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356734 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:38.357623 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356736 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:38.357623 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356739 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:38.357623 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356742 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:38.357623 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356744 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:38.357623 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356746 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:38.357623 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356750 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:38.357623 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356753 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:38.357623 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356756 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:38.357623 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356758 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:38.357623 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356761 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:38.357623 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356763 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:38.357623 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356766 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:38.358096 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356769 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:38.358096 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356772 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:38.358096 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356775 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:38.358096 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356778 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:38.358096 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356780 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:38.358096 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356783 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:38.358096 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356786 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:38.358096 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356788 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:38.358096 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356791 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:38.358096 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356794 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:38.358096 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356797 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:38.358096 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356800 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:38.358096 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356802 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:38.358096 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356805 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:38.358096 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356807 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:38.358096 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356811 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:38.358096 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356815 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:38.358096 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356817 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:38.358096 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356820 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:38.358594 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356823 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:38.358594 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356826 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:38.358594 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356829 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:38.358594 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356831 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:38.358594 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356834 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:38.358594 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.356837 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:38.358594 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357863 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:38.358594 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357870 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:38.358594 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357873 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:38.358594 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357877 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:38.358594 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357879 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:38.358594 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357882 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:38.358594 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357885 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:38.358594 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357889 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:38.358594 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357892 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:38.358594 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357895 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:38.358594 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357899 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:38.358594 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357902 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:38.358594 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357904 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:38.358594 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357907 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:38.359062 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357910 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:38.359062 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357912 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:38.359062 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357916 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:38.359062 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357919 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:38.359062 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357922 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:38.359062 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357924 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:38.359062 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357927 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:38.359062 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357930 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:38.359062 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357932 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:38.359062 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357934 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:38.359062 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357937 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:38.359062 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357939 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:38.359062 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357941 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:38.359062 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357944 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:38.359062 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357946 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:38.359062 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357949 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:38.359062 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357952 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:38.359062 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357955 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:38.359062 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357958 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:38.359062 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357961 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:38.359568 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357963 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:38.359568 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357966 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:38.359568 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357968 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:38.359568 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357971 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:38.359568 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357973 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:38.359568 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357975 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:38.359568 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357978 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:38.359568 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357980 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:38.359568 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357983 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:38.359568 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357985 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:38.359568 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357988 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:38.359568 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.357990 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:38.359568 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358009 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:38.359568 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358014 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:38.359568 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358016 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:38.359568 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358022 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:38.359568 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358025 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:38.359568 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358027 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:38.359568 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358030 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:38.360057 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358033 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:38.360057 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358035 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:38.360057 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358038 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:38.360057 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358040 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:38.360057 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358043 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:38.360057 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358045 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:38.360057 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358047 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:38.360057 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358050 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:38.360057 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358053 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:38.360057 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358055 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:38.360057 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358058 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:38.360057 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358061 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:38.360057 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358063 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:38.360057 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358066 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:38.360057 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358068 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:38.360057 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358071 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:38.360057 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358074 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:38.360057 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358077 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:38.360057 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358079 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:38.360057 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358082 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:38.360547 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358084 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:38.360547 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358086 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:38.360547 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358089 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:38.360547 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358091 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:38.360547 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358093 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:38.360547 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358096 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:38.360547 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358098 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:38.360547 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358101 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:38.360547 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358103 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:38.360547 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358106 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:38.360547 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358109 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:38.360547 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358112 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:38.360547 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358114 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:38.360547 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358177 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:27:38.360547 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358183 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:27:38.360547 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358191 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:27:38.360547 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358196 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:27:38.360547 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358200 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:27:38.360547 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358204 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:27:38.360547 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358211 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:27:38.360547 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358218 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358222 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358227 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358232 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358236 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358239 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358242 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358244 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358247 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358251 2575 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358253 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358256 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358261 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358264 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358267 2575 flags.go:64] FLAG: --config-dir=""
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358270 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358273 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358277 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358280 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358283 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358286 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358289 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358292 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358295 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358298 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:27:38.361048 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358301 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358305 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358312 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358315 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358318 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358321 2575 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358324 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358328 2575 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358331 2575 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358333 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358336 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358339 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358343 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358345 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358348 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358351 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358354 2575 flags.go:64] FLAG:
--eviction-soft-grace-period="" Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358357 2575 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358359 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358362 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358368 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358371 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358374 2575 flags.go:64] FLAG: --feature-gates="" Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358377 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358380 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 21:27:38.361695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358383 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358386 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358389 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358392 2575 flags.go:64] FLAG: --help="false" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358395 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-136-160.ec2.internal" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358398 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:27:38.362281 ip-10-0-136-160 
kubenswrapper[2575]: I0424 21:27:38.358401 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358404 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358407 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358412 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358428 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358431 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358434 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358437 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358440 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358443 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358446 2575 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358449 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358451 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358454 2575 flags.go:64] FLAG: 
--kubelet-cgroups="" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358457 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358460 2575 flags.go:64] FLAG: --lock-file="" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358463 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358465 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:27:38.362281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358469 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358474 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358476 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358480 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358483 2575 flags.go:64] FLAG: --logging-format="text" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358486 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358489 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358492 2575 flags.go:64] FLAG: --manifest-url="" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358495 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358499 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:27:38.362860 ip-10-0-136-160 
kubenswrapper[2575]: I0424 21:27:38.358502 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358506 2575 flags.go:64] FLAG: --max-pods="110" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358508 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358511 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358514 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358517 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358520 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358525 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358528 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358535 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358537 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358541 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358543 2575 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:27:38.362860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358546 2575 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358552 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358555 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358558 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358560 2575 flags.go:64] FLAG: --port="10250" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358563 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358566 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-021fe24be6a87eb2e" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358569 2575 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358572 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358575 2575 flags.go:64] FLAG: --register-node="true" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358578 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358580 2575 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358584 2575 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358588 2575 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358591 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 24 
21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358594 2575 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358597 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358600 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358604 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358606 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358609 2575 flags.go:64] FLAG: --runonce="false" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358612 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358615 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358618 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358622 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358625 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:27:38.363389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358629 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358632 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358635 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 
21:27:38.358638 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358640 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358643 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358646 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358649 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358652 2575 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358654 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358660 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358662 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358665 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358670 2575 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358672 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358675 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358678 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358681 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 
21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358683 2575 flags.go:64] FLAG: --v="2" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358688 2575 flags.go:64] FLAG: --version="false" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358692 2575 flags.go:64] FLAG: --vmodule="" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358696 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.358700 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358788 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358792 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:38.364013 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358795 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:38.364616 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358797 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:38.364616 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358800 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:38.364616 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358803 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:38.364616 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358806 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:38.364616 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358808 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:38.364616 ip-10-0-136-160 
kubenswrapper[2575]: W0424 21:27:38.358812 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:38.364616 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358816 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:38.364616 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358818 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:38.364616 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358821 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:38.364616 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358823 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:38.364616 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358826 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:38.364616 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358828 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:38.364616 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358831 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:38.364616 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358833 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:38.364616 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358836 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:38.364616 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358838 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:38.364616 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358841 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:38.364616 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358843 2575 feature_gate.go:328] unrecognized feature gate: 
NoRegistryClusterOperations Apr 24 21:27:38.364616 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358846 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:38.365087 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358850 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:27:38.365087 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358853 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:38.365087 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358856 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:27:38.365087 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358858 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:38.365087 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358861 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:38.365087 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358864 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:38.365087 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358867 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:38.365087 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358870 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:38.365087 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358873 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:27:38.365087 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358877 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:38.365087 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358880 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:38.365087 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358883 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:27:38.365087 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358885 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:38.365087 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358887 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:38.365087 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358890 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:38.365087 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358892 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:38.365087 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358895 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:38.365087 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358898 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:38.365087 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358900 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:38.365576 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358904 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:38.365576 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358907 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:38.365576 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358909 2575 
feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:38.365576 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358911 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:38.365576 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358914 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:38.365576 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358916 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:38.365576 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358919 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:38.365576 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358921 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:38.365576 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358923 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:27:38.365576 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358926 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:38.365576 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358928 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:38.365576 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358931 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:38.365576 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358933 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:38.365576 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358935 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:38.365576 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358938 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:38.365576 ip-10-0-136-160 
kubenswrapper[2575]: W0424 21:27:38.358940 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:38.365576 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358943 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:38.365576 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358945 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:38.365576 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358948 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:38.366025 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358951 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:27:38.366025 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358953 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:38.366025 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358956 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:38.366025 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358958 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:38.366025 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358962 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:38.366025 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358964 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:38.366025 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358966 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:38.366025 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358969 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:38.366025 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358971 2575 feature_gate.go:328] unrecognized feature 
gate: NewOLM Apr 24 21:27:38.366025 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358974 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:38.366025 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358976 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:38.366025 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358979 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:38.366025 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358981 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:38.366025 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358985 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:38.366025 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358987 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:38.366025 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358990 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:27:38.366025 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358992 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:38.366025 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358995 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:38.366025 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358997 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:38.366025 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.358999 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:38.366522 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.359002 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:38.366522 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.359004 2575 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:38.366522 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.359006 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:38.366522 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.359009 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:27:38.366522 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.359012 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:38.366522 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.359014 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:38.366522 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.359583 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:27:38.366522 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.365742 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 21:27:38.366522 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.365755 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 21:27:38.366522 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365799 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:38.366522 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365804 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:38.366522 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365807 2575 
feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:27:38.366522 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365810 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:38.366522 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365814 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:38.366522 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365817 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:38.366522 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365820 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:38.366912 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365824 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:27:38.366912 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365828 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:38.366912 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365831 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:38.366912 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365833 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:38.366912 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365836 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:27:38.366912 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365839 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:38.366912 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365841 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:38.366912 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365844 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 
21:27:38.366912 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365846 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:38.366912 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365849 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:38.366912 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365852 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:38.366912 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365854 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:38.366912 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365857 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:38.366912 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365859 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:38.366912 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365863 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:27:38.366912 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365867 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:38.366912 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365870 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:38.366912 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365873 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:38.366912 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365876 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:38.367366 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365879 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:38.367366 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365881 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:38.367366 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365883 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:38.367366 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365886 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:27:38.367366 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365888 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:27:38.367366 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365892 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:38.367366 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365895 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:38.367366 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365897 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:38.367366 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365900 
2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:38.367366 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365903 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:38.367366 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365905 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:38.367366 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365908 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:38.367366 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365911 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:38.367366 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365914 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:38.367366 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365916 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:38.367366 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365919 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:38.367366 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365921 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:38.367366 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365924 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:38.367366 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365926 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:38.367366 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365929 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:38.367909 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365931 2575 feature_gate.go:328] unrecognized feature gate: 
VSphereMultiNetworks Apr 24 21:27:38.367909 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365933 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:38.367909 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365936 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:38.367909 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365938 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:38.367909 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365942 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:38.367909 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365945 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:38.367909 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365947 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:38.367909 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365950 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:38.367909 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365952 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:38.367909 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365955 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:38.367909 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365958 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:38.367909 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365960 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:38.367909 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365963 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:38.367909 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365965 2575 
feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:38.367909 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365967 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:38.367909 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365970 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:38.367909 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365972 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:38.367909 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365975 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:38.367909 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365978 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:38.367909 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365980 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:38.368384 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365983 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:38.368384 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365985 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:38.368384 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365987 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:38.368384 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365990 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:38.368384 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365992 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:38.368384 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365994 2575 feature_gate.go:328] unrecognized feature 
gate: SetEIPForNLBIngressController Apr 24 21:27:38.368384 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365997 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:38.368384 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.365999 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:38.368384 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366002 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:38.368384 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366004 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:38.368384 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366007 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:38.368384 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366009 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:27:38.368384 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366012 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:38.368384 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366014 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:38.368384 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366017 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:38.368384 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366019 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:27:38.368384 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366022 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:38.368384 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366025 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:38.368384 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366027 2575 
feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:38.368384 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366030 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:38.368907 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.366035 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:27:38.368907 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366147 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:38.368907 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366152 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:38.368907 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366155 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:38.368907 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366157 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:38.368907 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366160 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:38.368907 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366163 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:38.368907 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366166 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:38.368907 ip-10-0-136-160 
kubenswrapper[2575]: W0424 21:27:38.366169 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:38.368907 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366171 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:38.368907 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366174 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:38.368907 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366176 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:38.368907 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366179 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:38.368907 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366181 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:38.368907 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366183 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:27:38.369278 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366186 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:38.369278 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366188 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:38.369278 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366191 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:38.369278 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366193 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:38.369278 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366195 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:38.369278 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366197 2575 feature_gate.go:328] unrecognized feature gate: 
KMSEncryptionProvider Apr 24 21:27:38.369278 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366200 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:38.369278 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366202 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:38.369278 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366204 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:38.369278 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366207 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:38.369278 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366209 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:38.369278 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366211 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:38.369278 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366214 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:38.369278 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366216 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:38.369278 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366218 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:38.369278 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366221 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:38.369278 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366224 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:38.369278 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366226 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:27:38.369278 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366229 2575 feature_gate.go:328] 
unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:38.369278 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366232 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:38.369780 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366234 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:38.369780 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366237 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:38.369780 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366239 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:38.369780 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366242 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:38.369780 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366244 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:38.369780 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366246 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:38.369780 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366249 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:38.369780 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366252 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:38.369780 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366254 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:38.369780 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366257 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:38.369780 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366259 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:38.369780 ip-10-0-136-160 
kubenswrapper[2575]: W0424 21:27:38.366262 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:38.369780 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366264 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:38.369780 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366267 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:38.369780 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366269 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:38.369780 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366272 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:38.369780 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366274 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:38.369780 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366276 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:38.369780 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366279 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:38.370224 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366281 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:38.370224 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366283 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:38.370224 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366286 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:38.370224 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366288 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:38.370224 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366291 2575 
feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:38.370224 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366293 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:38.370224 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366297 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:27:38.370224 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366300 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:38.370224 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366302 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:38.370224 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366305 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:27:38.370224 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366308 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:38.370224 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366311 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:38.370224 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366313 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:38.370224 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366317 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:27:38.370224 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366320 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:38.370224 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366323 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:27:38.370224 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366325 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:38.370224 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366327 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:27:38.370224 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366330 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:38.370680 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366332 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:38.370680 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366335 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:27:38.370680 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366338 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:38.370680 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366340 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:38.370680 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366343 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:38.370680 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366345 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:38.370680 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366347 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:38.370680 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366350 2575 feature_gate.go:328] 
unrecognized feature gate: AzureMultiDisk Apr 24 21:27:38.370680 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366352 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:38.370680 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366355 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:38.370680 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366357 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:38.370680 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366359 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:38.370680 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366361 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:38.370680 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:38.366364 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:38.370680 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.366368 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:27:38.370680 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.367012 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 21:27:38.371065 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.370256 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 21:27:38.371117 ip-10-0-136-160 
kubenswrapper[2575]: I0424 21:27:38.371105 2575 server.go:1019] "Starting client certificate rotation" Apr 24 21:27:38.371214 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.371200 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 21:27:38.371245 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.371238 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 21:27:38.392632 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.392616 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 21:27:38.397949 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.397924 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 21:27:38.407217 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.407200 2575 log.go:25] "Validated CRI v1 runtime API" Apr 24 21:27:38.412298 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.412283 2575 log.go:25] "Validated CRI v1 image API" Apr 24 21:27:38.413443 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.413415 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 21:27:38.415662 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.415643 2575 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b75be8a6-a59a-4281-93fd-255d95672871:/dev/nvme0n1p4 fe707412-d217-4346-b9bd-deaca2aa9891:/dev/nvme0n1p3] Apr 24 21:27:38.415708 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.415662 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs 
blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 21:27:38.420981 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.420872 2575 manager.go:217] Machine: {Timestamp:2026-04-24 21:27:38.419170884 +0000 UTC m=+0.359016608 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3200037 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2fce35a6971d18cbd138ded0bd0c82 SystemUUID:ec2fce35-a697-1d18-cbd1-38ded0bd0c82 BootID:bc496b5d-9b99-4107-9470-32bda00f8fc3 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1c:9f:97:7f:4f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1c:9f:97:7f:4f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:b2:04:be:ec:c4:98 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 
BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 21:27:38.420981 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.420981 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 24 21:27:38.421097 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.421085 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 21:27:38.423337 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.423321 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:27:38.423586 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.423561 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 21:27:38.423725 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.423591 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-136-160.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 21:27:38.423778 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.423734 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 21:27:38.423778 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.423743 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 21:27:38.423778 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.423757 
2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:27:38.424992 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.424982 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:27:38.425671 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.425661 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:27:38.425781 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.425772 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:27:38.427939 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.427929 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:27:38.427976 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.427947 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:27:38.427976 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.427961 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 21:27:38.427976 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.427969 2575 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:27:38.428052 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.427977 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 21:27:38.429251 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.429240 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:27:38.429290 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.429257 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:27:38.431938 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.431920 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:27:38.433138 ip-10-0-136-160 
kubenswrapper[2575]: I0424 21:27:38.433125 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:27:38.434661 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.434646 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:27:38.434661 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.434663 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:27:38.434762 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.434669 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:27:38.434762 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.434680 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:27:38.434762 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.434686 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:27:38.434762 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.434691 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:27:38.434762 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.434697 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 21:27:38.434762 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.434702 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:27:38.434762 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.434709 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:27:38.434762 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.434714 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:27:38.434762 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.434723 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 
21:27:38.434762 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.434732 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:27:38.435522 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.435477 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:27:38.435522 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.435523 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:27:38.441409 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.441283 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:27:38.441505 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.441451 2575 server.go:1295] "Started kubelet" Apr 24 21:27:38.441591 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.441540 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:27:38.441662 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.441618 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:27:38.441709 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.441687 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:27:38.442385 ip-10-0-136-160 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 21:27:38.442731 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.442704 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:27:38.442806 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.442753 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:27:38.443113 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.442967 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-160.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:27:38.443113 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:38.443000 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:27:38.443113 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:38.443065 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-160.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:27:38.445960 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.445942 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-h9kzj" Apr 24 21:27:38.447696 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.447674 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:27:38.447801 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:38.446783 2575 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-160.ec2.internal.18a9682a33cdbe08 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-160.ec2.internal,UID:ip-10-0-136-160.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-160.ec2.internal,},FirstTimestamp:2026-04-24 21:27:38.441408008 +0000 UTC m=+0.381253732,LastTimestamp:2026-04-24 21:27:38.441408008 +0000 UTC m=+0.381253732,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-160.ec2.internal,}" Apr 24 21:27:38.448202 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.448191 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:27:38.448954 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.448936 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:27:38.448954 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.448937 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:27:38.448954 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.448958 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:27:38.449137 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.449041 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:27:38.449137 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.449048 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:27:38.449311 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:38.449295 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-160.ec2.internal\" not found" Apr 24 
21:27:38.452111 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.452012 2575 factory.go:55] Registering systemd factory Apr 24 21:27:38.452111 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.452034 2575 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:27:38.452616 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.452513 2575 factory.go:153] Registering CRI-O factory Apr 24 21:27:38.452616 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.452532 2575 factory.go:223] Registration of the crio container factory successfully Apr 24 21:27:38.452616 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.452585 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:27:38.452616 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.452608 2575 factory.go:103] Registering Raw factory Apr 24 21:27:38.452813 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.452624 2575 manager.go:1196] Started watching for new ooms in manager Apr 24 21:27:38.453071 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.453052 2575 manager.go:319] Starting recovery of all containers Apr 24 21:27:38.454725 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:38.454091 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:27:38.455153 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.455134 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-h9kzj" Apr 24 21:27:38.460197 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:38.460161 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 21:27:38.460330 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:38.460309 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-136-160.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 21:27:38.461523 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.461502 2575 manager.go:324] Recovery completed Apr 24 21:27:38.466292 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.466279 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:38.468748 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.468733 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-160.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:38.468802 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.468767 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:38.468802 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.468781 2575 kubelet_node_status.go:736] "Recording event message 
for node" node="ip-10-0-136-160.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:38.469253 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.469238 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:27:38.469253 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.469250 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:27:38.469343 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.469264 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:27:38.471401 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.471390 2575 policy_none.go:49] "None policy: Start" Apr 24 21:27:38.471459 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.471406 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:27:38.471459 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.471434 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:27:38.508997 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.508983 2575 manager.go:341] "Starting Device Plugin manager" Apr 24 21:27:38.517114 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:38.509041 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:27:38.517114 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.509057 2575 server.go:85] "Starting device plugin registration server" Apr 24 21:27:38.517114 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.509237 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:27:38.517114 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.509245 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:27:38.517114 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.509433 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:27:38.517114 ip-10-0-136-160 kubenswrapper[2575]: 
I0424 21:27:38.509545 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:27:38.517114 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.509554 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:27:38.517114 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:38.509962 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 21:27:38.517114 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:38.509995 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-160.ec2.internal\" not found" Apr 24 21:27:38.576498 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.576463 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:27:38.577593 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.577574 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 21:27:38.577661 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.577600 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:27:38.577661 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.577622 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 21:27:38.577661 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.577630 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:27:38.577661 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:38.577659 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:27:38.581061 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.581036 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:38.610010 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.609964 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:38.611531 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.611516 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-160.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:38.611606 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.611549 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:38.611606 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.611563 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-160.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:38.611606 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.611590 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-160.ec2.internal" Apr 24 21:27:38.620457 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.620440 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-160.ec2.internal" Apr 24 21:27:38.620511 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:38.620461 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-160.ec2.internal\": node \"ip-10-0-136-160.ec2.internal\" not found" Apr 24 
21:27:38.635569 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:38.635547 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-160.ec2.internal\" not found" Apr 24 21:27:38.678262 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.678240 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-160.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-160.ec2.internal"] Apr 24 21:27:38.678350 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.678302 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:38.679924 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.679907 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-160.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:38.679995 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.679937 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:38.679995 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.679950 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-160.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:38.681266 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.681253 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:38.681400 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.681387 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-160.ec2.internal" Apr 24 21:27:38.681451 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.681414 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:38.681911 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.681896 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-160.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:38.681979 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.681896 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-160.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:38.681979 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.681948 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:38.681979 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.681959 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-160.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:38.682064 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.681927 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:38.682064 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.682004 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-160.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:38.683144 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.683131 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-160.ec2.internal" Apr 24 21:27:38.683186 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.683155 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:38.683766 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.683749 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-160.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:38.683870 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.683773 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:38.683870 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.683788 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-160.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:38.710938 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:38.710917 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-160.ec2.internal\" not found" node="ip-10-0-136-160.ec2.internal" Apr 24 21:27:38.715006 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:38.714991 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-160.ec2.internal\" not found" node="ip-10-0-136-160.ec2.internal" Apr 24 21:27:38.736155 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:38.736135 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-160.ec2.internal\" not found" Apr 24 21:27:38.750666 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.750641 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/67300e2a61c195d7f46d25ebcf08a36d-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-136-160.ec2.internal\" (UID: \"67300e2a61c195d7f46d25ebcf08a36d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-160.ec2.internal" Apr 24 21:27:38.750737 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.750671 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/67300e2a61c195d7f46d25ebcf08a36d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-160.ec2.internal\" (UID: \"67300e2a61c195d7f46d25ebcf08a36d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-160.ec2.internal" Apr 24 21:27:38.750737 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.750688 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0eb3818d29d9daefc24a29562dc700e4-config\") pod \"kube-apiserver-proxy-ip-10-0-136-160.ec2.internal\" (UID: \"0eb3818d29d9daefc24a29562dc700e4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-160.ec2.internal" Apr 24 21:27:38.836312 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:38.836295 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-160.ec2.internal\" not found" Apr 24 21:27:38.851648 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.851633 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/67300e2a61c195d7f46d25ebcf08a36d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-160.ec2.internal\" (UID: \"67300e2a61c195d7f46d25ebcf08a36d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-160.ec2.internal" Apr 24 21:27:38.851708 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.851656 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/67300e2a61c195d7f46d25ebcf08a36d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-160.ec2.internal\" (UID: \"67300e2a61c195d7f46d25ebcf08a36d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-160.ec2.internal"
Apr 24 21:27:38.851708 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.851672 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0eb3818d29d9daefc24a29562dc700e4-config\") pod \"kube-apiserver-proxy-ip-10-0-136-160.ec2.internal\" (UID: \"0eb3818d29d9daefc24a29562dc700e4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-160.ec2.internal"
Apr 24 21:27:38.851708 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.851701 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0eb3818d29d9daefc24a29562dc700e4-config\") pod \"kube-apiserver-proxy-ip-10-0-136-160.ec2.internal\" (UID: \"0eb3818d29d9daefc24a29562dc700e4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-160.ec2.internal"
Apr 24 21:27:38.851803 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.851723 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/67300e2a61c195d7f46d25ebcf08a36d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-160.ec2.internal\" (UID: \"67300e2a61c195d7f46d25ebcf08a36d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-160.ec2.internal"
Apr 24 21:27:38.851803 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:38.851729 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/67300e2a61c195d7f46d25ebcf08a36d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-160.ec2.internal\" (UID: \"67300e2a61c195d7f46d25ebcf08a36d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-160.ec2.internal"
Apr 24 21:27:38.936985 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:38.936960 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-160.ec2.internal\" not found"
Apr 24 21:27:39.013466 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:39.013445 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-160.ec2.internal"
Apr 24 21:27:39.017001 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:39.016985 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-160.ec2.internal"
Apr 24 21:27:39.037601 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:39.037579 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-160.ec2.internal\" not found"
Apr 24 21:27:39.138133 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:39.138096 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-160.ec2.internal\" not found"
Apr 24 21:27:39.238612 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:39.238552 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-160.ec2.internal\" not found"
Apr 24 21:27:39.339105 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:39.339074 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-160.ec2.internal\" not found"
Apr 24 21:27:39.371554 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:39.371531 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 21:27:39.372091 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:39.371679 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:27:39.440181 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:39.440151 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-160.ec2.internal\" not found"
Apr 24 21:27:39.447954 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:39.447935 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 21:27:39.457149 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:39.457118 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:22:38 +0000 UTC" deadline="2027-10-27 05:16:31.711855592 +0000 UTC"
Apr 24 21:27:39.457149 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:39.457148 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13207h48m52.25471106s"
Apr 24 21:27:39.467049 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:39.467030 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:27:39.478637 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:39.478600 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67300e2a61c195d7f46d25ebcf08a36d.slice/crio-e72b25121fc601f500c9046c18e939aa80ee148e463a365fb05c29c83ca9c601 WatchSource:0}: Error finding container e72b25121fc601f500c9046c18e939aa80ee148e463a365fb05c29c83ca9c601: Status 404 returned error can't find the container with id e72b25121fc601f500c9046c18e939aa80ee148e463a365fb05c29c83ca9c601
Apr 24 21:27:39.478909 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:39.478890 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eb3818d29d9daefc24a29562dc700e4.slice/crio-0f0765ab386e5216bc123e0f306bb8fde07c5566a63c1ae5adc92313aebe7cb9 WatchSource:0}: Error finding container 0f0765ab386e5216bc123e0f306bb8fde07c5566a63c1ae5adc92313aebe7cb9: Status 404 returned error can't find the container with id 0f0765ab386e5216bc123e0f306bb8fde07c5566a63c1ae5adc92313aebe7cb9
Apr 24 21:27:39.483535 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:39.483522 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:27:39.487133 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:39.487117 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-nq7jg"
Apr 24 21:27:39.497600 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:39.497554 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-nq7jg"
Apr 24 21:27:39.526585 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:39.526562 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:27:39.548830 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:39.548802 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-160.ec2.internal"
Apr 24 21:27:39.568137 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:39.568121 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:27:39.568968 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:39.568953 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:27:39.569097 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:39.569085 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-160.ec2.internal"
Apr 24 21:27:39.578375 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:39.578353 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:27:39.579774 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:39.579742 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-160.ec2.internal" event={"ID":"67300e2a61c195d7f46d25ebcf08a36d","Type":"ContainerStarted","Data":"e72b25121fc601f500c9046c18e939aa80ee148e463a365fb05c29c83ca9c601"}
Apr 24 21:27:39.580588 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:39.580571 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-160.ec2.internal" event={"ID":"0eb3818d29d9daefc24a29562dc700e4","Type":"ContainerStarted","Data":"0f0765ab386e5216bc123e0f306bb8fde07c5566a63c1ae5adc92313aebe7cb9"}
Apr 24 21:27:39.891687 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:39.891605 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:27:40.212922 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.212891 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:27:40.429797 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.429760 2575 apiserver.go:52] "Watching apiserver"
Apr 24 21:27:40.437136 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.437110 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 21:27:40.437526 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.437493 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-136-160.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r","openshift-cluster-node-tuning-operator/tuned-l5g4l","openshift-image-registry/node-ca-flsks","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-160.ec2.internal","openshift-multus/multus-nv6gl","openshift-multus/network-metrics-daemon-l78bh","openshift-network-diagnostics/network-check-target-2sm45","kube-system/konnectivity-agent-58dwf","openshift-dns/node-resolver-vdppq","openshift-multus/multus-additional-cni-plugins-kb7pn","openshift-network-operator/iptables-alerter-6hct6","openshift-ovn-kubernetes/ovnkube-node-zglkz"]
Apr 24 21:27:40.440634 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.440611 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l78bh"
Apr 24 21:27:40.440727 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:40.440701 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l78bh" podUID="6ec8ce7f-d73f-4ff5-a981-9d84448a51a6"
Apr 24 21:27:40.440795 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.440779 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r"
Apr 24 21:27:40.441880 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.441858 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-l5g4l"
Apr 24 21:27:40.442901 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.442882 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-flsks"
Apr 24 21:27:40.443386 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.443365 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-wv2wf\""
Apr 24 21:27:40.443794 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.443664 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 21:27:40.443794 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.443671 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 21:27:40.443794 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.443671 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 21:27:40.444295 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.444236 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:27:40.444295 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.444246 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-z62gp\""
Apr 24 21:27:40.444554 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.444535 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 21:27:40.444856 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.444832 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nv6gl"
Apr 24 21:27:40.444926 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.444871 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 21:27:40.445385 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.445354 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 21:27:40.445509 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.445493 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 21:27:40.445557 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.445530 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-z47zn\""
Apr 24 21:27:40.445968 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.445872 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm45"
Apr 24 21:27:40.445968 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:40.445931 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm45" podUID="a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f"
Apr 24 21:27:40.446101 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.445990 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-58dwf"
Apr 24 21:27:40.447107 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.447084 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vdppq"
Apr 24 21:27:40.447761 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.447086 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 21:27:40.447761 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.447351 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 21:27:40.447761 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.447352 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 21:27:40.447761 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.447565 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 21:27:40.447761 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.447589 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-pdptk\""
Apr 24 21:27:40.448184 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.448162 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 21:27:40.448269 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.448257 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-bwdgz\""
Apr 24 21:27:40.448359 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.448341 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kb7pn"
Apr 24 21:27:40.448489 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.448473 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 21:27:40.449313 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.449293 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 21:27:40.449574 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.449554 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 21:27:40.449664 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.449554 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-285zj\""
Apr 24 21:27:40.450056 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.449940 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-6hct6"
Apr 24 21:27:40.452454 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.450879 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 21:27:40.452454 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.451165 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 21:27:40.452454 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.451733 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-f6n7r\""
Apr 24 21:27:40.452454 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.451873 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zglkz"
Apr 24 21:27:40.453313 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.453293 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:27:40.453508 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.453335 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 21:27:40.453823 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.453801 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2vjjw\""
Apr 24 21:27:40.453903 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.453824 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 21:27:40.454338 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.454163 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 21:27:40.454338 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.454321 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 21:27:40.454338 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.454326 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-q8d98\""
Apr 24 21:27:40.454924 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.454752 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 21:27:40.456128 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.455967 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 21:27:40.456128 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.455984 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 21:27:40.456128 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.456037 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 21:27:40.459625 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.459609 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-host-var-lib-cni-multus\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl"
Apr 24 21:27:40.459703 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.459633 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrv9c\" (UniqueName: \"kubernetes.io/projected/a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f-kube-api-access-lrv9c\") pod \"network-check-target-2sm45\" (UID: \"a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f\") " pod="openshift-network-diagnostics/network-check-target-2sm45"
Apr 24 21:27:40.459703 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.459651 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-os-release\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl"
Apr 24 21:27:40.459703 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.459667 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-host-run-multus-certs\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl"
Apr 24 21:27:40.459703 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.459691 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7gh2\" (UniqueName: \"kubernetes.io/projected/c69c2633-a089-45fd-9a6f-5b56c0d7beb1-kube-api-access-n7gh2\") pod \"node-resolver-vdppq\" (UID: \"c69c2633-a089-45fd-9a6f-5b56c0d7beb1\") " pod="openshift-dns/node-resolver-vdppq"
Apr 24 21:27:40.459853 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.459717 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bb182b1c-327c-4054-8814-10769b9fc643-host-slash\") pod \"iptables-alerter-6hct6\" (UID: \"bb182b1c-327c-4054-8814-10769b9fc643\") " pod="openshift-network-operator/iptables-alerter-6hct6"
Apr 24 21:27:40.459853 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.459735 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn"
Apr 24 21:27:40.459853 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.459754 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-multus-conf-dir\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl"
Apr 24 21:27:40.459853 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.459792 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/01b5838c-cf53-4f25-8edb-f0bb7176b567-registration-dir\") pod \"aws-ebs-csi-driver-node-gbl5r\" (UID: \"01b5838c-cf53-4f25-8edb-f0bb7176b567\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r"
Apr 24 21:27:40.459853 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.459814 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/01b5838c-cf53-4f25-8edb-f0bb7176b567-device-dir\") pod \"aws-ebs-csi-driver-node-gbl5r\" (UID: \"01b5838c-cf53-4f25-8edb-f0bb7176b567\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r"
Apr 24 21:27:40.459853 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.459831 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-host-run-netns\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl"
Apr 24 21:27:40.459853 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.459846 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-etc-sysconfig\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l"
Apr 24 21:27:40.460091 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.459860 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jszf\" (UniqueName: \"kubernetes.io/projected/155ba158-2f39-4023-b916-b8d0af483d46-kube-api-access-5jszf\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l"
Apr 24 21:27:40.460091 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.459875 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4327adce-270c-40d3-b3a2-3f3c1acfa545-host\") pod \"node-ca-flsks\" (UID: \"4327adce-270c-40d3-b3a2-3f3c1acfa545\") " pod="openshift-image-registry/node-ca-flsks"
Apr 24 21:27:40.460091 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.459888 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4327adce-270c-40d3-b3a2-3f3c1acfa545-serviceca\") pod \"node-ca-flsks\" (UID: \"4327adce-270c-40d3-b3a2-3f3c1acfa545\") " pod="openshift-image-registry/node-ca-flsks"
Apr 24 21:27:40.460091 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.459901 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn"
Apr 24 21:27:40.460091 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.459914 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/19fe13f9-cfb4-4b0f-8c65-000ccc157cbb-agent-certs\") pod \"konnectivity-agent-58dwf\" (UID: \"19fe13f9-cfb4-4b0f-8c65-000ccc157cbb\") " pod="kube-system/konnectivity-agent-58dwf"
Apr 24 21:27:40.460091 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.459940 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/01b5838c-cf53-4f25-8edb-f0bb7176b567-etc-selinux\") pod \"aws-ebs-csi-driver-node-gbl5r\" (UID: \"01b5838c-cf53-4f25-8edb-f0bb7176b567\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r"
Apr 24 21:27:40.460091 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.459958 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-host-var-lib-cni-bin\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl"
Apr 24 21:27:40.460091 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.459997 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wmtx\" (UniqueName: \"kubernetes.io/projected/bb182b1c-327c-4054-8814-10769b9fc643-kube-api-access-6wmtx\") pod \"iptables-alerter-6hct6\" (UID: \"bb182b1c-327c-4054-8814-10769b9fc643\") " pod="openshift-network-operator/iptables-alerter-6hct6"
Apr 24 21:27:40.460091 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460027 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/01b5838c-cf53-4f25-8edb-f0bb7176b567-socket-dir\") pod \"aws-ebs-csi-driver-node-gbl5r\" (UID: \"01b5838c-cf53-4f25-8edb-f0bb7176b567\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r"
Apr 24 21:27:40.460091 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460051 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-system-cni-dir\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl"
Apr 24 21:27:40.460091 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460076 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-etc-kubernetes\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl"
Apr 24 21:27:40.460603 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460109 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-etc-systemd\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l"
Apr 24 21:27:40.460603 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460137 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-multus-daemon-config\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl"
Apr 24 21:27:40.460603 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460159 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-etc-modprobe-d\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l"
Apr 24 21:27:40.460603 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460180 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-etc-sysctl-conf\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l"
Apr 24 21:27:40.460603 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460217 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-host\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l"
Apr 24 21:27:40.460603 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460250 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/155ba158-2f39-4023-b916-b8d0af483d46-etc-tuned\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l"
Apr 24 21:27:40.460603 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460274 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-etc-sysctl-d\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l"
Apr 24 21:27:40.460603 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460301 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-os-release\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn"
Apr 24 21:27:40.460603 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460326 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-host-var-lib-kubelet\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl"
Apr 24 21:27:40.460603 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460349 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/19fe13f9-cfb4-4b0f-8c65-000ccc157cbb-konnectivity-ca\") pod \"konnectivity-agent-58dwf\" (UID: \"19fe13f9-cfb4-4b0f-8c65-000ccc157cbb\") " pod="kube-system/konnectivity-agent-58dwf"
Apr 24 21:27:40.460603 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460371 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-var-lib-kubelet\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l"
Apr 24 21:27:40.460603 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460399 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bb182b1c-327c-4054-8814-10769b9fc643-iptables-alerter-script\") pod \"iptables-alerter-6hct6\" (UID: \"bb182b1c-327c-4054-8814-10769b9fc643\") " pod="openshift-network-operator/iptables-alerter-6hct6"
Apr 24 21:27:40.460603 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460447 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01b5838c-cf53-4f25-8edb-f0bb7176b567-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gbl5r\" (UID: \"01b5838c-cf53-4f25-8edb-f0bb7176b567\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r"
Apr 24 21:27:40.460603 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460470 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spvqm\" (UniqueName: \"kubernetes.io/projected/01b5838c-cf53-4f25-8edb-f0bb7176b567-kube-api-access-spvqm\") pod \"aws-ebs-csi-driver-node-gbl5r\" (UID: \"01b5838c-cf53-4f25-8edb-f0bb7176b567\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r"
Apr 24 21:27:40.460603 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460495 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn"
Apr 24 21:27:40.460603 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460549 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-system-cni-dir\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn"
Apr 24 21:27:40.461237 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460583 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-cni-binary-copy\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn"
Apr 24 21:27:40.461237 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460607 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cnpt\" (UniqueName: \"kubernetes.io/projected/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-kube-api-access-5cnpt\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn"
Apr 24 21:27:40.461237 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460627 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-multus-cni-dir\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl"
Apr 24 21:27:40.461237 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460642 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-cni-binary-copy\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl"
Apr 24 21:27:40.461237 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460664 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldvwc\" (UniqueName: \"kubernetes.io/projected/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-kube-api-access-ldvwc\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl"
Apr 24 21:27:40.461237 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460701 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp69p\" (UniqueName: \"kubernetes.io/projected/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-kube-api-access-sp69p\") pod \"network-metrics-daemon-l78bh\" (UID: \"6ec8ce7f-d73f-4ff5-a981-9d84448a51a6\") " pod="openshift-multus/network-metrics-daemon-l78bh"
Apr 24 21:27:40.461237 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460731 2575
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb59b\" (UniqueName: \"kubernetes.io/projected/4327adce-270c-40d3-b3a2-3f3c1acfa545-kube-api-access-qb59b\") pod \"node-ca-flsks\" (UID: \"4327adce-270c-40d3-b3a2-3f3c1acfa545\") " pod="openshift-image-registry/node-ca-flsks" Apr 24 21:27:40.461237 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460756 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-multus-socket-dir-parent\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.461237 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460779 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-host-run-k8s-cni-cncf-io\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.461237 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460800 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-hostroot\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.461237 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460820 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c69c2633-a089-45fd-9a6f-5b56c0d7beb1-hosts-file\") pod \"node-resolver-vdppq\" (UID: \"c69c2633-a089-45fd-9a6f-5b56c0d7beb1\") " pod="openshift-dns/node-resolver-vdppq" Apr 24 
21:27:40.461237 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460843 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/01b5838c-cf53-4f25-8edb-f0bb7176b567-sys-fs\") pod \"aws-ebs-csi-driver-node-gbl5r\" (UID: \"01b5838c-cf53-4f25-8edb-f0bb7176b567\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r" Apr 24 21:27:40.461237 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460877 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-sys\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.461237 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460899 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-lib-modules\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.461237 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460926 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs\") pod \"network-metrics-daemon-l78bh\" (UID: \"6ec8ce7f-d73f-4ff5-a981-9d84448a51a6\") " pod="openshift-multus/network-metrics-daemon-l78bh" Apr 24 21:27:40.461237 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460940 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-etc-kubernetes\") pod \"tuned-l5g4l\" (UID: 
\"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.461687 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.460972 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-run\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.461687 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.461029 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-cnibin\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn" Apr 24 21:27:40.461687 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.461058 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-cnibin\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.461687 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.461093 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c69c2633-a089-45fd-9a6f-5b56c0d7beb1-tmp-dir\") pod \"node-resolver-vdppq\" (UID: \"c69c2633-a089-45fd-9a6f-5b56c0d7beb1\") " pod="openshift-dns/node-resolver-vdppq" Apr 24 21:27:40.461687 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.461116 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/155ba158-2f39-4023-b916-b8d0af483d46-tmp\") pod 
\"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.498231 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.498208 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:39 +0000 UTC" deadline="2028-01-23 08:53:30.54093435 +0000 UTC" Apr 24 21:27:40.498231 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.498230 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15323h25m50.042706613s" Apr 24 21:27:40.550268 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.550243 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 21:27:40.562192 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562162 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-etc-modprobe-d\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.562329 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562197 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-etc-sysctl-conf\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.562329 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562213 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-host\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " 
pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.562329 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562232 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/155ba158-2f39-4023-b916-b8d0af483d46-etc-tuned\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.562329 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562255 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-etc-sysctl-d\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.562329 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-os-release\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn" Apr 24 21:27:40.562329 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562299 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-host-var-lib-kubelet\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.562329 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562322 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/19fe13f9-cfb4-4b0f-8c65-000ccc157cbb-konnectivity-ca\") pod \"konnectivity-agent-58dwf\" (UID: 
\"19fe13f9-cfb4-4b0f-8c65-000ccc157cbb\") " pod="kube-system/konnectivity-agent-58dwf" Apr 24 21:27:40.562600 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562346 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-var-lib-kubelet\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.562600 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562355 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-etc-modprobe-d\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.562600 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562374 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-host-run-netns\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.562600 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562376 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-etc-sysctl-conf\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.562600 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562433 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bb182b1c-327c-4054-8814-10769b9fc643-iptables-alerter-script\") 
pod \"iptables-alerter-6hct6\" (UID: \"bb182b1c-327c-4054-8814-10769b9fc643\") " pod="openshift-network-operator/iptables-alerter-6hct6" Apr 24 21:27:40.562600 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562447 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-host\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.562600 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562467 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.562600 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562496 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01b5838c-cf53-4f25-8edb-f0bb7176b567-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gbl5r\" (UID: \"01b5838c-cf53-4f25-8edb-f0bb7176b567\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r" Apr 24 21:27:40.562600 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562502 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-host-var-lib-kubelet\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.562600 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562521 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spvqm\" (UniqueName: 
\"kubernetes.io/projected/01b5838c-cf53-4f25-8edb-f0bb7176b567-kube-api-access-spvqm\") pod \"aws-ebs-csi-driver-node-gbl5r\" (UID: \"01b5838c-cf53-4f25-8edb-f0bb7176b567\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r" Apr 24 21:27:40.562600 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562546 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/88ca30bc-f546-43f6-8751-e5c36307eb86-ovn-node-metrics-cert\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.562600 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn" Apr 24 21:27:40.562600 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562599 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-host-kubelet\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.563097 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562610 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-etc-sysctl-d\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.563097 ip-10-0-136-160 
kubenswrapper[2575]: I0424 21:27:40.562627 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-system-cni-dir\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn" Apr 24 21:27:40.563097 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562653 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-cni-binary-copy\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn" Apr 24 21:27:40.563097 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562673 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-os-release\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn" Apr 24 21:27:40.563097 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562678 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5cnpt\" (UniqueName: \"kubernetes.io/projected/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-kube-api-access-5cnpt\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn" Apr 24 21:27:40.563097 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562692 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:27:40.563097 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562712 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-multus-cni-dir\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.563097 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-cni-binary-copy\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.563097 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562763 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldvwc\" (UniqueName: \"kubernetes.io/projected/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-kube-api-access-ldvwc\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.563097 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562790 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sp69p\" (UniqueName: \"kubernetes.io/projected/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-kube-api-access-sp69p\") pod \"network-metrics-daemon-l78bh\" (UID: \"6ec8ce7f-d73f-4ff5-a981-9d84448a51a6\") " pod="openshift-multus/network-metrics-daemon-l78bh" Apr 24 21:27:40.563097 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562814 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qb59b\" (UniqueName: 
\"kubernetes.io/projected/4327adce-270c-40d3-b3a2-3f3c1acfa545-kube-api-access-qb59b\") pod \"node-ca-flsks\" (UID: \"4327adce-270c-40d3-b3a2-3f3c1acfa545\") " pod="openshift-image-registry/node-ca-flsks" Apr 24 21:27:40.563097 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562839 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-multus-socket-dir-parent\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.563097 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562865 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-host-run-k8s-cni-cncf-io\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.563097 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562891 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-hostroot\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.563097 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562925 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c69c2633-a089-45fd-9a6f-5b56c0d7beb1-hosts-file\") pod \"node-resolver-vdppq\" (UID: \"c69c2633-a089-45fd-9a6f-5b56c0d7beb1\") " pod="openshift-dns/node-resolver-vdppq" Apr 24 21:27:40.563097 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562951 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/01b5838c-cf53-4f25-8edb-f0bb7176b567-sys-fs\") pod \"aws-ebs-csi-driver-node-gbl5r\" (UID: \"01b5838c-cf53-4f25-8edb-f0bb7176b567\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r" Apr 24 21:27:40.563097 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.562984 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-sys\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.563097 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563044 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-multus-cni-dir\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.563894 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563049 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-var-lib-kubelet\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.563894 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563086 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-system-cni-dir\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn" Apr 24 21:27:40.563894 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563139 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-hostroot\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.563894 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563182 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c69c2633-a089-45fd-9a6f-5b56c0d7beb1-hosts-file\") pod \"node-resolver-vdppq\" (UID: \"c69c2633-a089-45fd-9a6f-5b56c0d7beb1\") " pod="openshift-dns/node-resolver-vdppq" Apr 24 21:27:40.563894 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563229 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01b5838c-cf53-4f25-8edb-f0bb7176b567-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gbl5r\" (UID: \"01b5838c-cf53-4f25-8edb-f0bb7176b567\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r" Apr 24 21:27:40.563894 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563284 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/01b5838c-cf53-4f25-8edb-f0bb7176b567-sys-fs\") pod \"aws-ebs-csi-driver-node-gbl5r\" (UID: \"01b5838c-cf53-4f25-8edb-f0bb7176b567\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r" Apr 24 21:27:40.563894 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563302 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-multus-socket-dir-parent\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.563894 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563054 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-host-run-k8s-cni-cncf-io\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.563894 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563524 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-sys\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.563894 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563569 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/19fe13f9-cfb4-4b0f-8c65-000ccc157cbb-konnectivity-ca\") pod \"konnectivity-agent-58dwf\" (UID: \"19fe13f9-cfb4-4b0f-8c65-000ccc157cbb\") " pod="kube-system/konnectivity-agent-58dwf" Apr 24 21:27:40.563894 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563628 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-lib-modules\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.563894 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563657 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs\") pod \"network-metrics-daemon-l78bh\" (UID: \"6ec8ce7f-d73f-4ff5-a981-9d84448a51a6\") " pod="openshift-multus/network-metrics-daemon-l78bh" Apr 24 21:27:40.563894 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563679 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-cni-binary-copy\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.563894 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563679 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn" Apr 24 21:27:40.563894 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563683 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-etc-kubernetes\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.563894 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563728 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-run\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.563894 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:40.563763 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:40.563894 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563770 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bb182b1c-327c-4054-8814-10769b9fc643-iptables-alerter-script\") pod \"iptables-alerter-6hct6\" (UID: 
\"bb182b1c-327c-4054-8814-10769b9fc643\") " pod="openshift-network-operator/iptables-alerter-6hct6" Apr 24 21:27:40.564723 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563764 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-var-lib-openvswitch\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.564723 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563761 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-run\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.564723 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563770 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-lib-modules\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.564723 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563801 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-etc-kubernetes\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.564723 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563798 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn" Apr 24 21:27:40.564723 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:40.563856 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs podName:6ec8ce7f-d73f-4ff5-a981-9d84448a51a6 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:41.063810633 +0000 UTC m=+3.003656387 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs") pod "network-metrics-daemon-l78bh" (UID: "6ec8ce7f-d73f-4ff5-a981-9d84448a51a6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:40.564723 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563883 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-cnibin\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn" Apr 24 21:27:40.564723 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563905 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-cnibin\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn" Apr 24 21:27:40.564723 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563910 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-cnibin\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " 
pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.564723 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563950 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-cnibin\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.564723 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563953 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c69c2633-a089-45fd-9a6f-5b56c0d7beb1-tmp-dir\") pod \"node-resolver-vdppq\" (UID: \"c69c2633-a089-45fd-9a6f-5b56c0d7beb1\") " pod="openshift-dns/node-resolver-vdppq" Apr 24 21:27:40.564723 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.563978 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/155ba158-2f39-4023-b916-b8d0af483d46-tmp\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.564723 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564004 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-run-ovn\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.564723 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564027 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-log-socket\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 
21:27:40.564723 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564051 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-host-cni-bin\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.564723 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564070 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-host-var-lib-cni-multus\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.564723 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564086 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrv9c\" (UniqueName: \"kubernetes.io/projected/a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f-kube-api-access-lrv9c\") pod \"network-check-target-2sm45\" (UID: \"a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f\") " pod="openshift-network-diagnostics/network-check-target-2sm45" Apr 24 21:27:40.565440 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564110 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-os-release\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.565440 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-host-run-multus-certs\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " 
pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.565440 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564140 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-host-var-lib-cni-multus\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.565440 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564157 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7gh2\" (UniqueName: \"kubernetes.io/projected/c69c2633-a089-45fd-9a6f-5b56c0d7beb1-kube-api-access-n7gh2\") pod \"node-resolver-vdppq\" (UID: \"c69c2633-a089-45fd-9a6f-5b56c0d7beb1\") " pod="openshift-dns/node-resolver-vdppq" Apr 24 21:27:40.565440 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564182 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bb182b1c-327c-4054-8814-10769b9fc643-host-slash\") pod \"iptables-alerter-6hct6\" (UID: \"bb182b1c-327c-4054-8814-10769b9fc643\") " pod="openshift-network-operator/iptables-alerter-6hct6" Apr 24 21:27:40.565440 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564204 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-os-release\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.565440 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564209 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-host-run-ovn-kubernetes\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.565440 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564219 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c69c2633-a089-45fd-9a6f-5b56c0d7beb1-tmp-dir\") pod \"node-resolver-vdppq\" (UID: \"c69c2633-a089-45fd-9a6f-5b56c0d7beb1\") " pod="openshift-dns/node-resolver-vdppq" Apr 24 21:27:40.565440 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564232 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-host-cni-netd\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.565440 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564258 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn" Apr 24 21:27:40.565440 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564314 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-multus-conf-dir\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.565440 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564367 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-multus-conf-dir\") pod \"multus-nv6gl\" (UID: 
\"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.565440 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564401 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/01b5838c-cf53-4f25-8edb-f0bb7176b567-registration-dir\") pod \"aws-ebs-csi-driver-node-gbl5r\" (UID: \"01b5838c-cf53-4f25-8edb-f0bb7176b567\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r" Apr 24 21:27:40.565440 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564406 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-host-run-multus-certs\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.565440 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564457 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/01b5838c-cf53-4f25-8edb-f0bb7176b567-device-dir\") pod \"aws-ebs-csi-driver-node-gbl5r\" (UID: \"01b5838c-cf53-4f25-8edb-f0bb7176b567\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r" Apr 24 21:27:40.565440 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564500 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/01b5838c-cf53-4f25-8edb-f0bb7176b567-registration-dir\") pod \"aws-ebs-csi-driver-node-gbl5r\" (UID: \"01b5838c-cf53-4f25-8edb-f0bb7176b567\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r" Apr 24 21:27:40.565440 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564543 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-host-slash\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.566053 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564571 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-run-openvswitch\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.566053 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564583 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bb182b1c-327c-4054-8814-10769b9fc643-host-slash\") pod \"iptables-alerter-6hct6\" (UID: \"bb182b1c-327c-4054-8814-10769b9fc643\") " pod="openshift-network-operator/iptables-alerter-6hct6" Apr 24 21:27:40.566053 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564619 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/01b5838c-cf53-4f25-8edb-f0bb7176b567-device-dir\") pod \"aws-ebs-csi-driver-node-gbl5r\" (UID: \"01b5838c-cf53-4f25-8edb-f0bb7176b567\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r" Apr 24 21:27:40.566053 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564651 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/88ca30bc-f546-43f6-8751-e5c36307eb86-env-overrides\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.566053 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564657 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn" Apr 24 21:27:40.566053 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-host-run-netns\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.566053 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564702 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-etc-sysconfig\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.566053 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564713 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-host-run-netns\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.566053 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564727 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jszf\" (UniqueName: \"kubernetes.io/projected/155ba158-2f39-4023-b916-b8d0af483d46-kube-api-access-5jszf\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.566053 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564753 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-systemd-units\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.566053 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564769 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-etc-sysconfig\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.566053 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564776 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/88ca30bc-f546-43f6-8751-e5c36307eb86-ovnkube-config\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.566053 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564802 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4327adce-270c-40d3-b3a2-3f3c1acfa545-host\") pod \"node-ca-flsks\" (UID: \"4327adce-270c-40d3-b3a2-3f3c1acfa545\") " pod="openshift-image-registry/node-ca-flsks" Apr 24 21:27:40.566053 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564825 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4327adce-270c-40d3-b3a2-3f3c1acfa545-serviceca\") pod \"node-ca-flsks\" (UID: \"4327adce-270c-40d3-b3a2-3f3c1acfa545\") " pod="openshift-image-registry/node-ca-flsks" Apr 24 21:27:40.566053 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564852 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn" Apr 24 21:27:40.566053 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/19fe13f9-cfb4-4b0f-8c65-000ccc157cbb-agent-certs\") pod \"konnectivity-agent-58dwf\" (UID: \"19fe13f9-cfb4-4b0f-8c65-000ccc157cbb\") " pod="kube-system/konnectivity-agent-58dwf" Apr 24 21:27:40.566053 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564903 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/01b5838c-cf53-4f25-8edb-f0bb7176b567-etc-selinux\") pod \"aws-ebs-csi-driver-node-gbl5r\" (UID: \"01b5838c-cf53-4f25-8edb-f0bb7176b567\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r" Apr 24 21:27:40.566668 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564929 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-etc-openvswitch\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.566668 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564960 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22qrg\" (UniqueName: \"kubernetes.io/projected/88ca30bc-f546-43f6-8751-e5c36307eb86-kube-api-access-22qrg\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 
24 21:27:40.566668 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564988 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-host-var-lib-cni-bin\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.566668 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564999 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: \"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn" Apr 24 21:27:40.566668 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.565014 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wmtx\" (UniqueName: \"kubernetes.io/projected/bb182b1c-327c-4054-8814-10769b9fc643-kube-api-access-6wmtx\") pod \"iptables-alerter-6hct6\" (UID: \"bb182b1c-327c-4054-8814-10769b9fc643\") " pod="openshift-network-operator/iptables-alerter-6hct6" Apr 24 21:27:40.566668 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.565048 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/01b5838c-cf53-4f25-8edb-f0bb7176b567-socket-dir\") pod \"aws-ebs-csi-driver-node-gbl5r\" (UID: \"01b5838c-cf53-4f25-8edb-f0bb7176b567\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r" Apr 24 21:27:40.566668 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.565086 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-node-log\") pod \"ovnkube-node-zglkz\" (UID: 
\"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.566668 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.565111 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/88ca30bc-f546-43f6-8751-e5c36307eb86-ovnkube-script-lib\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.566668 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.565139 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-system-cni-dir\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.566668 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.565163 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-etc-kubernetes\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.566668 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.565171 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4327adce-270c-40d3-b3a2-3f3c1acfa545-serviceca\") pod \"node-ca-flsks\" (UID: \"4327adce-270c-40d3-b3a2-3f3c1acfa545\") " pod="openshift-image-registry/node-ca-flsks" Apr 24 21:27:40.566668 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.565185 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-etc-systemd\") pod \"tuned-l5g4l\" (UID: 
\"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.566668 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.565211 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-run-systemd\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.566668 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.564878 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4327adce-270c-40d3-b3a2-3f3c1acfa545-host\") pod \"node-ca-flsks\" (UID: \"4327adce-270c-40d3-b3a2-3f3c1acfa545\") " pod="openshift-image-registry/node-ca-flsks" Apr 24 21:27:40.566668 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.565235 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-multus-daemon-config\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.566668 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.565274 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-host-var-lib-cni-bin\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.566668 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.565276 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-etc-kubernetes\") pod \"multus-nv6gl\" (UID: 
\"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.567208 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.565320 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/01b5838c-cf53-4f25-8edb-f0bb7176b567-etc-selinux\") pod \"aws-ebs-csi-driver-node-gbl5r\" (UID: \"01b5838c-cf53-4f25-8edb-f0bb7176b567\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r" Apr 24 21:27:40.567208 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.565336 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/01b5838c-cf53-4f25-8edb-f0bb7176b567-socket-dir\") pod \"aws-ebs-csi-driver-node-gbl5r\" (UID: \"01b5838c-cf53-4f25-8edb-f0bb7176b567\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r" Apr 24 21:27:40.567208 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.565365 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-system-cni-dir\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.567208 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.565376 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/155ba158-2f39-4023-b916-b8d0af483d46-etc-systemd\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.567208 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.565718 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/155ba158-2f39-4023-b916-b8d0af483d46-etc-tuned\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") 
" pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.567208 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.565875 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-multus-daemon-config\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.567208 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.566365 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/155ba158-2f39-4023-b916-b8d0af483d46-tmp\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.567470 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.567452 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/19fe13f9-cfb4-4b0f-8c65-000ccc157cbb-agent-certs\") pod \"konnectivity-agent-58dwf\" (UID: \"19fe13f9-cfb4-4b0f-8c65-000ccc157cbb\") " pod="kube-system/konnectivity-agent-58dwf" Apr 24 21:27:40.572132 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.572114 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spvqm\" (UniqueName: \"kubernetes.io/projected/01b5838c-cf53-4f25-8edb-f0bb7176b567-kube-api-access-spvqm\") pod \"aws-ebs-csi-driver-node-gbl5r\" (UID: \"01b5838c-cf53-4f25-8edb-f0bb7176b567\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r" Apr 24 21:27:40.574949 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.574903 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cnpt\" (UniqueName: \"kubernetes.io/projected/edd258c5-66bc-4d60-8302-4f99f9bfa7dc-kube-api-access-5cnpt\") pod \"multus-additional-cni-plugins-kb7pn\" (UID: 
\"edd258c5-66bc-4d60-8302-4f99f9bfa7dc\") " pod="openshift-multus/multus-additional-cni-plugins-kb7pn" Apr 24 21:27:40.575213 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.575115 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldvwc\" (UniqueName: \"kubernetes.io/projected/abbc168a-6ea4-427c-8c8d-16f6a126b2a8-kube-api-access-ldvwc\") pod \"multus-nv6gl\" (UID: \"abbc168a-6ea4-427c-8c8d-16f6a126b2a8\") " pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.575915 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:40.575891 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:40.576065 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:40.576053 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:40.576221 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:40.576209 2575 projected.go:194] Error preparing data for projected volume kube-api-access-lrv9c for pod openshift-network-diagnostics/network-check-target-2sm45: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:40.576731 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:40.576656 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f-kube-api-access-lrv9c podName:a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f nodeName:}" failed. No retries permitted until 2026-04-24 21:27:41.076411318 +0000 UTC m=+3.016257053 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-lrv9c" (UniqueName: "kubernetes.io/projected/a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f-kube-api-access-lrv9c") pod "network-check-target-2sm45" (UID: "a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:40.577690 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.577669 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wmtx\" (UniqueName: \"kubernetes.io/projected/bb182b1c-327c-4054-8814-10769b9fc643-kube-api-access-6wmtx\") pod \"iptables-alerter-6hct6\" (UID: \"bb182b1c-327c-4054-8814-10769b9fc643\") " pod="openshift-network-operator/iptables-alerter-6hct6" Apr 24 21:27:40.577809 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.577789 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp69p\" (UniqueName: \"kubernetes.io/projected/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-kube-api-access-sp69p\") pod \"network-metrics-daemon-l78bh\" (UID: \"6ec8ce7f-d73f-4ff5-a981-9d84448a51a6\") " pod="openshift-multus/network-metrics-daemon-l78bh" Apr 24 21:27:40.577962 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.577945 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb59b\" (UniqueName: \"kubernetes.io/projected/4327adce-270c-40d3-b3a2-3f3c1acfa545-kube-api-access-qb59b\") pod \"node-ca-flsks\" (UID: \"4327adce-270c-40d3-b3a2-3f3c1acfa545\") " pod="openshift-image-registry/node-ca-flsks" Apr 24 21:27:40.578104 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.578084 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jszf\" (UniqueName: \"kubernetes.io/projected/155ba158-2f39-4023-b916-b8d0af483d46-kube-api-access-5jszf\") pod \"tuned-l5g4l\" (UID: \"155ba158-2f39-4023-b916-b8d0af483d46\") " 
pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.578967 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.578946 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7gh2\" (UniqueName: \"kubernetes.io/projected/c69c2633-a089-45fd-9a6f-5b56c0d7beb1-kube-api-access-n7gh2\") pod \"node-resolver-vdppq\" (UID: \"c69c2633-a089-45fd-9a6f-5b56c0d7beb1\") " pod="openshift-dns/node-resolver-vdppq" Apr 24 21:27:40.666357 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666326 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-host-run-ovn-kubernetes\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666357 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666360 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-host-cni-netd\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666582 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666433 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-host-run-ovn-kubernetes\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666582 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666457 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-host-cni-netd\") pod \"ovnkube-node-zglkz\" (UID: 
\"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666582 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666480 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-host-slash\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666582 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666495 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-run-openvswitch\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666582 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666516 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/88ca30bc-f546-43f6-8751-e5c36307eb86-env-overrides\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666582 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666535 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-host-slash\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666582 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666549 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-run-openvswitch\") pod \"ovnkube-node-zglkz\" (UID: 
\"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666582 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-systemd-units\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666582 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666571 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/88ca30bc-f546-43f6-8751-e5c36307eb86-ovnkube-config\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666999 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666603 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-etc-openvswitch\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666999 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22qrg\" (UniqueName: \"kubernetes.io/projected/88ca30bc-f546-43f6-8751-e5c36307eb86-kube-api-access-22qrg\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666999 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666638 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-node-log\") pod 
\"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666999 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666646 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-systemd-units\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666999 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666666 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-etc-openvswitch\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666999 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666654 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/88ca30bc-f546-43f6-8751-e5c36307eb86-ovnkube-script-lib\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666999 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666714 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-node-log\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666999 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666743 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-run-systemd\") pod 
\"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666999 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666775 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-host-run-netns\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666999 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666801 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666999 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666826 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/88ca30bc-f546-43f6-8751-e5c36307eb86-ovn-node-metrics-cert\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666999 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666835 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-host-run-netns\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666999 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-host-kubelet\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666999 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666825 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-run-systemd\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666999 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666857 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666999 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666898 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-var-lib-openvswitch\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.666999 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666924 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-run-ovn\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.667637 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666944 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-host-kubelet\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.667637 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666948 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-log-socket\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.667637 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666958 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-var-lib-openvswitch\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.667637 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666978 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-log-socket\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.667637 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666988 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-run-ovn\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.667637 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.666998 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-host-cni-bin\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.667637 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.667027 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/88ca30bc-f546-43f6-8751-e5c36307eb86-env-overrides\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.667637 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.667040 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88ca30bc-f546-43f6-8751-e5c36307eb86-host-cni-bin\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.667637 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.667329 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/88ca30bc-f546-43f6-8751-e5c36307eb86-ovnkube-config\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.667637 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.667342 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/88ca30bc-f546-43f6-8751-e5c36307eb86-ovnkube-script-lib\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.669246 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.669231 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/88ca30bc-f546-43f6-8751-e5c36307eb86-ovn-node-metrics-cert\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.675005 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.674986 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22qrg\" (UniqueName: \"kubernetes.io/projected/88ca30bc-f546-43f6-8751-e5c36307eb86-kube-api-access-22qrg\") pod \"ovnkube-node-zglkz\" (UID: \"88ca30bc-f546-43f6-8751-e5c36307eb86\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:40.754165 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.754102 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r" Apr 24 21:27:40.761741 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.761718 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" Apr 24 21:27:40.768352 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.768333 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-flsks" Apr 24 21:27:40.773914 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.773895 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nv6gl" Apr 24 21:27:40.778469 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.778450 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-58dwf" Apr 24 21:27:40.784020 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.784004 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vdppq" Apr 24 21:27:40.790552 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.790532 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kb7pn" Apr 24 21:27:40.796978 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.796962 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-6hct6" Apr 24 21:27:40.801536 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:40.801519 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:27:41.002369 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:41.002219 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc69c2633_a089_45fd_9a6f_5b56c0d7beb1.slice/crio-8d55f4a6b5fdcc9dca674f944dc5d7bd1b08b1e889f850db1a45ac8e9ea72071 WatchSource:0}: Error finding container 8d55f4a6b5fdcc9dca674f944dc5d7bd1b08b1e889f850db1a45ac8e9ea72071: Status 404 returned error can't find the container with id 8d55f4a6b5fdcc9dca674f944dc5d7bd1b08b1e889f850db1a45ac8e9ea72071 Apr 24 21:27:41.003335 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:41.003311 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01b5838c_cf53_4f25_8edb_f0bb7176b567.slice/crio-bc46b256f98eeb32135a1679f4ab9e8d8f0986316af887619ec7380196330e5d WatchSource:0}: Error finding container bc46b256f98eeb32135a1679f4ab9e8d8f0986316af887619ec7380196330e5d: Status 404 returned error can't find the container with id bc46b256f98eeb32135a1679f4ab9e8d8f0986316af887619ec7380196330e5d Apr 24 21:27:41.005070 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:41.005034 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19fe13f9_cfb4_4b0f_8c65_000ccc157cbb.slice/crio-2696b1b90be9ceceebc2eabd605d825f83978f6105dd5acd6d639676caf0b6b6 WatchSource:0}: Error finding container 
2696b1b90be9ceceebc2eabd605d825f83978f6105dd5acd6d639676caf0b6b6: Status 404 returned error can't find the container with id 2696b1b90be9ceceebc2eabd605d825f83978f6105dd5acd6d639676caf0b6b6 Apr 24 21:27:41.006296 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:41.006277 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb182b1c_327c_4054_8814_10769b9fc643.slice/crio-79b613b8b0270698df5baf8d4b0f43e762df17f9b112358af18886203993b6c1 WatchSource:0}: Error finding container 79b613b8b0270698df5baf8d4b0f43e762df17f9b112358af18886203993b6c1: Status 404 returned error can't find the container with id 79b613b8b0270698df5baf8d4b0f43e762df17f9b112358af18886203993b6c1 Apr 24 21:27:41.008123 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:41.008077 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4327adce_270c_40d3_b3a2_3f3c1acfa545.slice/crio-de7ef26e958348e61d39fe309b044c26146a32234af294401264ab8a8c226a7b WatchSource:0}: Error finding container de7ef26e958348e61d39fe309b044c26146a32234af294401264ab8a8c226a7b: Status 404 returned error can't find the container with id de7ef26e958348e61d39fe309b044c26146a32234af294401264ab8a8c226a7b Apr 24 21:27:41.008933 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:41.008834 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabbc168a_6ea4_427c_8c8d_16f6a126b2a8.slice/crio-8d39246fe7452ef656ad56e8da7ab4cabdd2c061dfce7b729385554e2cfc9ec5 WatchSource:0}: Error finding container 8d39246fe7452ef656ad56e8da7ab4cabdd2c061dfce7b729385554e2cfc9ec5: Status 404 returned error can't find the container with id 8d39246fe7452ef656ad56e8da7ab4cabdd2c061dfce7b729385554e2cfc9ec5 Apr 24 21:27:41.009641 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:41.009598 2575 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod155ba158_2f39_4023_b916_b8d0af483d46.slice/crio-1f663367b13d9b2b50772dc190e50cc70878ac828db16fdf0232d7c30234b4e3 WatchSource:0}: Error finding container 1f663367b13d9b2b50772dc190e50cc70878ac828db16fdf0232d7c30234b4e3: Status 404 returned error can't find the container with id 1f663367b13d9b2b50772dc190e50cc70878ac828db16fdf0232d7c30234b4e3 Apr 24 21:27:41.013069 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:41.013037 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedd258c5_66bc_4d60_8302_4f99f9bfa7dc.slice/crio-3ee305e276d0430b0898272862d70b2120deb20cf6117558c96a7055d2fd62b4 WatchSource:0}: Error finding container 3ee305e276d0430b0898272862d70b2120deb20cf6117558c96a7055d2fd62b4: Status 404 returned error can't find the container with id 3ee305e276d0430b0898272862d70b2120deb20cf6117558c96a7055d2fd62b4 Apr 24 21:27:41.014167 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:27:41.014146 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88ca30bc_f546_43f6_8751_e5c36307eb86.slice/crio-4b3907d41a1ee9082c7b86f4b504e3dc458e77bcaef76646efc860c7e8f24cbb WatchSource:0}: Error finding container 4b3907d41a1ee9082c7b86f4b504e3dc458e77bcaef76646efc860c7e8f24cbb: Status 404 returned error can't find the container with id 4b3907d41a1ee9082c7b86f4b504e3dc458e77bcaef76646efc860c7e8f24cbb Apr 24 21:27:41.068859 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:41.068835 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs\") pod \"network-metrics-daemon-l78bh\" (UID: \"6ec8ce7f-d73f-4ff5-a981-9d84448a51a6\") " pod="openshift-multus/network-metrics-daemon-l78bh" Apr 24 21:27:41.068974 ip-10-0-136-160 
kubenswrapper[2575]: E0424 21:27:41.068957 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:41.069028 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:41.069019 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs podName:6ec8ce7f-d73f-4ff5-a981-9d84448a51a6 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:42.069005002 +0000 UTC m=+4.008850713 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs") pod "network-metrics-daemon-l78bh" (UID: "6ec8ce7f-d73f-4ff5-a981-9d84448a51a6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:41.169329 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:41.169300 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrv9c\" (UniqueName: \"kubernetes.io/projected/a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f-kube-api-access-lrv9c\") pod \"network-check-target-2sm45\" (UID: \"a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f\") " pod="openshift-network-diagnostics/network-check-target-2sm45" Apr 24 21:27:41.169475 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:41.169435 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:41.169475 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:41.169454 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:41.169475 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:41.169463 2575 projected.go:194] Error preparing data for projected volume kube-api-access-lrv9c for 
pod openshift-network-diagnostics/network-check-target-2sm45: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:41.169580 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:41.169507 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f-kube-api-access-lrv9c podName:a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f nodeName:}" failed. No retries permitted until 2026-04-24 21:27:42.16949206 +0000 UTC m=+4.109337771 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-lrv9c" (UniqueName: "kubernetes.io/projected/a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f-kube-api-access-lrv9c") pod "network-check-target-2sm45" (UID: "a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:41.498775 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:41.498708 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:39 +0000 UTC" deadline="2028-02-08 12:38:31.508078836 +0000 UTC" Apr 24 21:27:41.498775 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:41.498747 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15711h10m50.009335904s" Apr 24 21:27:41.588262 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:41.588228 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" event={"ID":"88ca30bc-f546-43f6-8751-e5c36307eb86","Type":"ContainerStarted","Data":"4b3907d41a1ee9082c7b86f4b504e3dc458e77bcaef76646efc860c7e8f24cbb"} Apr 24 21:27:41.591738 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:41.591711 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kb7pn" event={"ID":"edd258c5-66bc-4d60-8302-4f99f9bfa7dc","Type":"ContainerStarted","Data":"3ee305e276d0430b0898272862d70b2120deb20cf6117558c96a7055d2fd62b4"}
Apr 24 21:27:41.593278 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:41.593235 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-58dwf" event={"ID":"19fe13f9-cfb4-4b0f-8c65-000ccc157cbb","Type":"ContainerStarted","Data":"2696b1b90be9ceceebc2eabd605d825f83978f6105dd5acd6d639676caf0b6b6"}
Apr 24 21:27:41.595611 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:41.595569 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r" event={"ID":"01b5838c-cf53-4f25-8edb-f0bb7176b567","Type":"ContainerStarted","Data":"bc46b256f98eeb32135a1679f4ab9e8d8f0986316af887619ec7380196330e5d"}
Apr 24 21:27:41.600549 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:41.600518 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-160.ec2.internal" event={"ID":"0eb3818d29d9daefc24a29562dc700e4","Type":"ContainerStarted","Data":"20648d545979f2f1b158762e6fd0d0380782559906197091f450fbac35872231"}
Apr 24 21:27:41.605237 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:41.605213 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" event={"ID":"155ba158-2f39-4023-b916-b8d0af483d46","Type":"ContainerStarted","Data":"1f663367b13d9b2b50772dc190e50cc70878ac828db16fdf0232d7c30234b4e3"}
Apr 24 21:27:41.609598 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:41.609476 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nv6gl" event={"ID":"abbc168a-6ea4-427c-8c8d-16f6a126b2a8","Type":"ContainerStarted","Data":"8d39246fe7452ef656ad56e8da7ab4cabdd2c061dfce7b729385554e2cfc9ec5"}
Apr 24 21:27:41.613170 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:41.613126 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-flsks" event={"ID":"4327adce-270c-40d3-b3a2-3f3c1acfa545","Type":"ContainerStarted","Data":"de7ef26e958348e61d39fe309b044c26146a32234af294401264ab8a8c226a7b"}
Apr 24 21:27:41.618780 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:41.618733 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-6hct6" event={"ID":"bb182b1c-327c-4054-8814-10769b9fc643","Type":"ContainerStarted","Data":"79b613b8b0270698df5baf8d4b0f43e762df17f9b112358af18886203993b6c1"}
Apr 24 21:27:41.620273 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:41.620236 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vdppq" event={"ID":"c69c2633-a089-45fd-9a6f-5b56c0d7beb1","Type":"ContainerStarted","Data":"8d55f4a6b5fdcc9dca674f944dc5d7bd1b08b1e889f850db1a45ac8e9ea72071"}
Apr 24 21:27:42.076917 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:42.076884 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs\") pod \"network-metrics-daemon-l78bh\" (UID: \"6ec8ce7f-d73f-4ff5-a981-9d84448a51a6\") " pod="openshift-multus/network-metrics-daemon-l78bh"
Apr 24 21:27:42.077087 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:42.077069 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:42.077166 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:42.077141 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs podName:6ec8ce7f-d73f-4ff5-a981-9d84448a51a6 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.077121657 +0000 UTC m=+6.016967374 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs") pod "network-metrics-daemon-l78bh" (UID: "6ec8ce7f-d73f-4ff5-a981-9d84448a51a6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:42.178189 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:42.177538 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrv9c\" (UniqueName: \"kubernetes.io/projected/a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f-kube-api-access-lrv9c\") pod \"network-check-target-2sm45\" (UID: \"a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f\") " pod="openshift-network-diagnostics/network-check-target-2sm45"
Apr 24 21:27:42.178189 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:42.177718 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:42.178189 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:42.177739 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:42.178189 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:42.177754 2575 projected.go:194] Error preparing data for projected volume kube-api-access-lrv9c for pod openshift-network-diagnostics/network-check-target-2sm45: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:42.178189 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:42.177817 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f-kube-api-access-lrv9c podName:a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.177797762 +0000 UTC m=+6.117643478 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-lrv9c" (UniqueName: "kubernetes.io/projected/a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f-kube-api-access-lrv9c") pod "network-check-target-2sm45" (UID: "a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:42.579238 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:42.578564 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l78bh"
Apr 24 21:27:42.579238 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:42.578694 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l78bh" podUID="6ec8ce7f-d73f-4ff5-a981-9d84448a51a6"
Apr 24 21:27:42.579238 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:42.579036 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm45"
Apr 24 21:27:42.579238 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:42.579126 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm45" podUID="a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f"
Apr 24 21:27:42.636686 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:42.636648 2575 generic.go:358] "Generic (PLEG): container finished" podID="67300e2a61c195d7f46d25ebcf08a36d" containerID="4e311b22fc29dfaf1ed984a0b41d681a71002b89320c6c40e02ce235c82f213b" exitCode=0
Apr 24 21:27:42.637179 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:42.637155 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-160.ec2.internal" event={"ID":"67300e2a61c195d7f46d25ebcf08a36d","Type":"ContainerDied","Data":"4e311b22fc29dfaf1ed984a0b41d681a71002b89320c6c40e02ce235c82f213b"}
Apr 24 21:27:42.652529 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:42.652480 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-160.ec2.internal" podStartSLOduration=3.6524627240000003 podStartE2EDuration="3.652462724s" podCreationTimestamp="2026-04-24 21:27:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:41.614741363 +0000 UTC m=+3.554587098" watchObservedRunningTime="2026-04-24 21:27:42.652462724 +0000 UTC m=+4.592308459"
Apr 24 21:27:43.647370 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:43.647331 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-160.ec2.internal" event={"ID":"67300e2a61c195d7f46d25ebcf08a36d","Type":"ContainerStarted","Data":"195d73fe6f60f12edd7e513d161c9dcc86bc22b9310f0a3da23d416cbe9a0347"}
Apr 24 21:27:44.092212 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:44.092107 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs\") pod \"network-metrics-daemon-l78bh\" (UID: \"6ec8ce7f-d73f-4ff5-a981-9d84448a51a6\") " pod="openshift-multus/network-metrics-daemon-l78bh"
Apr 24 21:27:44.092396 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:44.092276 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:44.092396 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:44.092354 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs podName:6ec8ce7f-d73f-4ff5-a981-9d84448a51a6 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:48.09233434 +0000 UTC m=+10.032180066 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs") pod "network-metrics-daemon-l78bh" (UID: "6ec8ce7f-d73f-4ff5-a981-9d84448a51a6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:44.192949 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:44.192612 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrv9c\" (UniqueName: \"kubernetes.io/projected/a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f-kube-api-access-lrv9c\") pod \"network-check-target-2sm45\" (UID: \"a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f\") " pod="openshift-network-diagnostics/network-check-target-2sm45"
Apr 24 21:27:44.192949 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:44.192810 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:44.192949 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:44.192830 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:44.192949 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:44.192843 2575 projected.go:194] Error preparing data for projected volume kube-api-access-lrv9c for pod openshift-network-diagnostics/network-check-target-2sm45: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:44.192949 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:44.192902 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f-kube-api-access-lrv9c podName:a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f nodeName:}" failed. No retries permitted until 2026-04-24 21:27:48.19288338 +0000 UTC m=+10.132729104 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-lrv9c" (UniqueName: "kubernetes.io/projected/a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f-kube-api-access-lrv9c") pod "network-check-target-2sm45" (UID: "a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:44.578583 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:44.578178 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l78bh"
Apr 24 21:27:44.578583 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:44.578207 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm45"
Apr 24 21:27:44.578583 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:44.578308 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l78bh" podUID="6ec8ce7f-d73f-4ff5-a981-9d84448a51a6"
Apr 24 21:27:44.578583 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:44.578474 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm45" podUID="a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f"
Apr 24 21:27:46.578904 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:46.578867 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm45"
Apr 24 21:27:46.579345 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:46.578974 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm45" podUID="a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f"
Apr 24 21:27:46.579345 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:46.578867 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l78bh"
Apr 24 21:27:46.579463 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:46.579413 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l78bh" podUID="6ec8ce7f-d73f-4ff5-a981-9d84448a51a6"
Apr 24 21:27:48.124440 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:48.124388 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs\") pod \"network-metrics-daemon-l78bh\" (UID: \"6ec8ce7f-d73f-4ff5-a981-9d84448a51a6\") " pod="openshift-multus/network-metrics-daemon-l78bh"
Apr 24 21:27:48.124894 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:48.124604 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:48.124894 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:48.124669 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs podName:6ec8ce7f-d73f-4ff5-a981-9d84448a51a6 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:56.124649336 +0000 UTC m=+18.064495052 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs") pod "network-metrics-daemon-l78bh" (UID: "6ec8ce7f-d73f-4ff5-a981-9d84448a51a6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:48.225566 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:48.225441 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrv9c\" (UniqueName: \"kubernetes.io/projected/a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f-kube-api-access-lrv9c\") pod \"network-check-target-2sm45\" (UID: \"a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f\") " pod="openshift-network-diagnostics/network-check-target-2sm45"
Apr 24 21:27:48.225722 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:48.225601 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:48.225722 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:48.225628 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:48.225722 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:48.225642 2575 projected.go:194] Error preparing data for projected volume kube-api-access-lrv9c for pod openshift-network-diagnostics/network-check-target-2sm45: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:48.225722 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:48.225707 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f-kube-api-access-lrv9c podName:a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f nodeName:}" failed. No retries permitted until 2026-04-24 21:27:56.225687025 +0000 UTC m=+18.165532737 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-lrv9c" (UniqueName: "kubernetes.io/projected/a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f-kube-api-access-lrv9c") pod "network-check-target-2sm45" (UID: "a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:48.578358 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:48.578005 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm45"
Apr 24 21:27:48.578358 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:48.578131 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm45" podUID="a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f"
Apr 24 21:27:48.579528 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:48.579354 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l78bh"
Apr 24 21:27:48.579528 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:48.579483 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l78bh" podUID="6ec8ce7f-d73f-4ff5-a981-9d84448a51a6"
Apr 24 21:27:50.578689 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:50.578649 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm45"
Apr 24 21:27:50.579142 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:50.578780 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm45" podUID="a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f"
Apr 24 21:27:50.579142 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:50.578837 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l78bh"
Apr 24 21:27:50.579142 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:50.578942 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l78bh" podUID="6ec8ce7f-d73f-4ff5-a981-9d84448a51a6"
Apr 24 21:27:52.581011 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:52.580981 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm45"
Apr 24 21:27:52.581011 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:52.580991 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l78bh"
Apr 24 21:27:52.581520 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:52.581109 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm45" podUID="a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f"
Apr 24 21:27:52.581520 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:52.581221 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l78bh" podUID="6ec8ce7f-d73f-4ff5-a981-9d84448a51a6"
Apr 24 21:27:54.578049 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:54.578014 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l78bh"
Apr 24 21:27:54.578506 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:54.578146 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l78bh" podUID="6ec8ce7f-d73f-4ff5-a981-9d84448a51a6"
Apr 24 21:27:54.578506 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:54.578175 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm45"
Apr 24 21:27:54.578506 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:54.578295 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm45" podUID="a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f"
Apr 24 21:27:56.187157 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:56.187116 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs\") pod \"network-metrics-daemon-l78bh\" (UID: \"6ec8ce7f-d73f-4ff5-a981-9d84448a51a6\") " pod="openshift-multus/network-metrics-daemon-l78bh"
Apr 24 21:27:56.187537 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:56.187256 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:56.187537 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:56.187332 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs podName:6ec8ce7f-d73f-4ff5-a981-9d84448a51a6 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:12.187315074 +0000 UTC m=+34.127160790 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs") pod "network-metrics-daemon-l78bh" (UID: "6ec8ce7f-d73f-4ff5-a981-9d84448a51a6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:56.288076 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:56.288030 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrv9c\" (UniqueName: \"kubernetes.io/projected/a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f-kube-api-access-lrv9c\") pod \"network-check-target-2sm45\" (UID: \"a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f\") " pod="openshift-network-diagnostics/network-check-target-2sm45"
Apr 24 21:27:56.288258 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:56.288167 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:56.288258 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:56.288189 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:56.288258 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:56.288201 2575 projected.go:194] Error preparing data for projected volume kube-api-access-lrv9c for pod openshift-network-diagnostics/network-check-target-2sm45: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:56.288258 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:56.288254 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f-kube-api-access-lrv9c podName:a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f nodeName:}" failed. No retries permitted until 2026-04-24 21:28:12.288239509 +0000 UTC m=+34.228085220 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-lrv9c" (UniqueName: "kubernetes.io/projected/a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f-kube-api-access-lrv9c") pod "network-check-target-2sm45" (UID: "a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:56.578450 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:56.578356 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l78bh"
Apr 24 21:27:56.578450 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:56.578400 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm45"
Apr 24 21:27:56.578649 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:56.578516 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l78bh" podUID="6ec8ce7f-d73f-4ff5-a981-9d84448a51a6"
Apr 24 21:27:56.578722 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:56.578663 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm45" podUID="a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f"
Apr 24 21:27:58.582898 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.582537 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm45"
Apr 24 21:27:58.583592 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:58.582981 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm45" podUID="a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f"
Apr 24 21:27:58.583592 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.582592 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l78bh"
Apr 24 21:27:58.583592 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:27:58.583454 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l78bh" podUID="6ec8ce7f-d73f-4ff5-a981-9d84448a51a6"
Apr 24 21:27:58.641326 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.641111 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 21:27:58.677871 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.677840 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nv6gl" event={"ID":"abbc168a-6ea4-427c-8c8d-16f6a126b2a8","Type":"ContainerStarted","Data":"0a3c84d3eff947bb0e980d82a9fa77f73bd8efa219e7568d968159710e724f4e"}
Apr 24 21:27:58.679147 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.679128 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-flsks" event={"ID":"4327adce-270c-40d3-b3a2-3f3c1acfa545","Type":"ContainerStarted","Data":"c992d975ea6859531319e40bf15d7720420c45a4b97deaa8297e5d8d700cd938"}
Apr 24 21:27:58.680255 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.680237 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vdppq" event={"ID":"c69c2633-a089-45fd-9a6f-5b56c0d7beb1","Type":"ContainerStarted","Data":"e59162ba5bab23a0d8431ca255f5e417172b6402b4e3153c16d0eb54b7ef9a6a"}
Apr 24 21:27:58.682522 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.682506 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log"
Apr 24 21:27:58.682800 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.682782 2575 generic.go:358] "Generic (PLEG): container finished" podID="88ca30bc-f546-43f6-8751-e5c36307eb86" containerID="ad141ee5fbdbf42b400c371e5c875c32dc97fd68550552107f96823c0b1d044b" exitCode=1
Apr 24 21:27:58.682860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.682831 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" event={"ID":"88ca30bc-f546-43f6-8751-e5c36307eb86","Type":"ContainerStarted","Data":"73f7d8581e24f5b0a07203ddce0855f7a8a31f9cbba03269d85a81c12e4f05a8"}
Apr 24 21:27:58.682860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.682846 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" event={"ID":"88ca30bc-f546-43f6-8751-e5c36307eb86","Type":"ContainerStarted","Data":"e905574abe12e4ef7dba9b9c2ff0ee8a718414f51ba961c790da9a05576c5738"}
Apr 24 21:27:58.682860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.682855 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" event={"ID":"88ca30bc-f546-43f6-8751-e5c36307eb86","Type":"ContainerStarted","Data":"f523ebb3b8f57088b3c3fd57f1e6d167e20b4fbf5286d5a7cdd8b78324dca4c3"}
Apr 24 21:27:58.683005 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.682863 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" event={"ID":"88ca30bc-f546-43f6-8751-e5c36307eb86","Type":"ContainerStarted","Data":"2be045c86f4f5906edaddc7cf9e05a0a6bbf2ab8d32721bc2eb32c4ac88cb895"}
Apr 24 21:27:58.683005 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.682872 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" event={"ID":"88ca30bc-f546-43f6-8751-e5c36307eb86","Type":"ContainerDied","Data":"ad141ee5fbdbf42b400c371e5c875c32dc97fd68550552107f96823c0b1d044b"}
Apr 24 21:27:58.683005 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.682886 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" event={"ID":"88ca30bc-f546-43f6-8751-e5c36307eb86","Type":"ContainerStarted","Data":"941ef5410599b0df63d77257dd006965b615825e2d620eae101fb5892c58c587"}
Apr 24 21:27:58.684161 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.684139 2575 generic.go:358] "Generic (PLEG): container finished" podID="edd258c5-66bc-4d60-8302-4f99f9bfa7dc" containerID="c2bf18f3ad70747b0e1adc5ff1c0d0c59936c65713f742e0b8afc5a70e941a9c" exitCode=0
Apr 24 21:27:58.684258 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.684196 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kb7pn" event={"ID":"edd258c5-66bc-4d60-8302-4f99f9bfa7dc","Type":"ContainerDied","Data":"c2bf18f3ad70747b0e1adc5ff1c0d0c59936c65713f742e0b8afc5a70e941a9c"}
Apr 24 21:27:58.685606 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.685575 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-58dwf" event={"ID":"19fe13f9-cfb4-4b0f-8c65-000ccc157cbb","Type":"ContainerStarted","Data":"7e1ba3042d4ca1d8df5fc157f91a6f1dedde26784875864ef01622820a0e27cc"}
Apr 24 21:27:58.687165 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.687148 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r" event={"ID":"01b5838c-cf53-4f25-8edb-f0bb7176b567","Type":"ContainerStarted","Data":"b84fb38d096228b21c638504ee8ed4c557445049e623144114943d3cd156ac10"}
Apr 24 21:27:58.687244 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.687171 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r" event={"ID":"01b5838c-cf53-4f25-8edb-f0bb7176b567","Type":"ContainerStarted","Data":"404a51ba94a7baf8aa758821257987a4e7ee31c1706b0e7fd0013a199dc0e814"}
Apr 24 21:27:58.688321 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.688304 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" event={"ID":"155ba158-2f39-4023-b916-b8d0af483d46","Type":"ContainerStarted","Data":"170e302f1a4672ba03ba58bb130a670bb0d5b7c457578514283ac81e02ea718a"}
Apr 24 21:27:58.697781 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.697726 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-160.ec2.internal" podStartSLOduration=19.697715868 podStartE2EDuration="19.697715868s" podCreationTimestamp="2026-04-24 21:27:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:43.662459369 +0000 UTC m=+5.602305100" watchObservedRunningTime="2026-04-24 21:27:58.697715868 +0000 UTC m=+20.637561601"
Apr 24 21:27:58.697888 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.697815 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nv6gl" podStartSLOduration=3.88232029 podStartE2EDuration="20.697810397s" podCreationTimestamp="2026-04-24 21:27:38 +0000 UTC" firstStartedPulling="2026-04-24 21:27:41.010565462 +0000 UTC m=+2.950411174" lastFinishedPulling="2026-04-24 21:27:57.826055556 +0000 UTC m=+19.765901281" observedRunningTime="2026-04-24 21:27:58.697493813 +0000 UTC m=+20.637339548" watchObservedRunningTime="2026-04-24 21:27:58.697810397 +0000 UTC m=+20.637656131"
Apr 24 21:27:58.720871 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.720835 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-vdppq" podStartSLOduration=4.194418181 podStartE2EDuration="20.720824413s" podCreationTimestamp="2026-04-24 21:27:38 +0000 UTC" firstStartedPulling="2026-04-24 21:27:41.003750194 +0000 UTC m=+2.943595906" lastFinishedPulling="2026-04-24 21:27:57.530156413 +0000 UTC m=+19.470002138" observedRunningTime="2026-04-24 21:27:58.720680891 +0000 UTC m=+20.660526623" watchObservedRunningTime="2026-04-24 21:27:58.720824413 +0000 UTC m=+20.660670146"
Apr 24 21:27:58.770856 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.770816 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-l5g4l" podStartSLOduration=4.24766998 podStartE2EDuration="20.770802943s" podCreationTimestamp="2026-04-24 21:27:38 +0000 UTC" firstStartedPulling="2026-04-24 21:27:41.011250549 +0000 UTC m=+2.951096260" lastFinishedPulling="2026-04-24 21:27:57.534383512 +0000 UTC m=+19.474229223" observedRunningTime="2026-04-24 21:27:58.770648969 +0000 UTC m=+20.710494703" watchObservedRunningTime="2026-04-24 21:27:58.770802943 +0000 UTC m=+20.710648676"
Apr 24 21:27:58.786878 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.786821 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-flsks" podStartSLOduration=4.266676828 podStartE2EDuration="20.786811788s" podCreationTimestamp="2026-04-24 21:27:38 +0000 UTC" firstStartedPulling="2026-04-24 21:27:41.010276565 +0000 UTC m=+2.950122288" lastFinishedPulling="2026-04-24 21:27:57.530411524 +0000 UTC m=+19.470257248" observedRunningTime="2026-04-24 21:27:58.786664632 +0000 UTC m=+20.726510365" watchObservedRunningTime="2026-04-24 21:27:58.786811788 +0000 UTC m=+20.726657521"
Apr 24 21:27:58.802476 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:58.802446 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-58dwf" podStartSLOduration=12.215177903 podStartE2EDuration="20.802415574s" podCreationTimestamp="2026-04-24 21:27:38 +0000 UTC" firstStartedPulling="2026-04-24 21:27:41.006508816 +0000 UTC m=+2.946354541" lastFinishedPulling="2026-04-24 21:27:49.593746482 +0000 UTC m=+11.533592212" observedRunningTime="2026-04-24 21:27:58.80227629 +0000 UTC m=+20.742122013" watchObservedRunningTime="2026-04-24 21:27:58.802415574 +0000 UTC m=+20.742261303"
Apr 24 21:27:59.520434 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:59.520333 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started"
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:27:58.641324584Z","UUID":"aaacbaed-5b46-455b-bff8-1013c209ebc1","Handler":null,"Name":"","Endpoint":""} Apr 24 21:27:59.523330 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:59.523308 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 21:27:59.523451 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:59.523337 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:27:59.693720 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:59.693686 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-6hct6" event={"ID":"bb182b1c-327c-4054-8814-10769b9fc643","Type":"ContainerStarted","Data":"78a52aab0b9f969ed76b59c34809271cced678b7eadc88e251ee71ab0d1f6818"} Apr 24 21:27:59.695768 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:59.695712 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r" event={"ID":"01b5838c-cf53-4f25-8edb-f0bb7176b567","Type":"ContainerStarted","Data":"70425f35ac0a20663d5e9d6e7c452a65abb7ef89276441461591e952873bcab2"} Apr 24 21:27:59.710674 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:59.710637 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-6hct6" podStartSLOduration=5.201101425 podStartE2EDuration="21.710627782s" podCreationTimestamp="2026-04-24 21:27:38 +0000 UTC" firstStartedPulling="2026-04-24 21:27:41.008241681 +0000 UTC m=+2.948087395" lastFinishedPulling="2026-04-24 21:27:57.517768027 +0000 UTC m=+19.457613752" observedRunningTime="2026-04-24 21:27:59.710479088 +0000 UTC m=+21.650324821" 
watchObservedRunningTime="2026-04-24 21:27:59.710627782 +0000 UTC m=+21.650473515" Apr 24 21:27:59.730214 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:27:59.730170 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gbl5r" podStartSLOduration=3.453192696 podStartE2EDuration="21.730159358s" podCreationTimestamp="2026-04-24 21:27:38 +0000 UTC" firstStartedPulling="2026-04-24 21:27:41.005222997 +0000 UTC m=+2.945068708" lastFinishedPulling="2026-04-24 21:27:59.282189659 +0000 UTC m=+21.222035370" observedRunningTime="2026-04-24 21:27:59.72998574 +0000 UTC m=+21.669831472" watchObservedRunningTime="2026-04-24 21:27:59.730159358 +0000 UTC m=+21.670005090" Apr 24 21:28:00.408153 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:00.408067 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-58dwf" Apr 24 21:28:00.408766 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:00.408742 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-58dwf" Apr 24 21:28:00.577834 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:00.577807 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l78bh" Apr 24 21:28:00.578010 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:00.577807 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm45" Apr 24 21:28:00.578010 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:00.577919 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l78bh" podUID="6ec8ce7f-d73f-4ff5-a981-9d84448a51a6" Apr 24 21:28:00.578010 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:00.577993 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm45" podUID="a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f" Apr 24 21:28:00.701139 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:00.701114 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 21:28:00.701799 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:00.701486 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" event={"ID":"88ca30bc-f546-43f6-8751-e5c36307eb86","Type":"ContainerStarted","Data":"e284a89fd8a720af1aeadf8bccbde1286e074b8aa75728190896eb12a720c7b5"} Apr 24 21:28:01.703369 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:01.703340 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 21:28:02.578652 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:02.578580 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l78bh" Apr 24 21:28:02.578814 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:02.578580 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm45" Apr 24 21:28:02.578814 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:02.578707 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l78bh" podUID="6ec8ce7f-d73f-4ff5-a981-9d84448a51a6" Apr 24 21:28:02.578814 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:02.578757 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm45" podUID="a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f" Apr 24 21:28:03.710668 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:03.710509 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 21:28:03.711074 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:03.710981 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" event={"ID":"88ca30bc-f546-43f6-8751-e5c36307eb86","Type":"ContainerStarted","Data":"6806395f8dccb17c983c26981dea1c5969e5aff2c3df428db2e78cf246f75455"} Apr 24 21:28:03.711276 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:03.711258 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:28:03.711339 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:03.711284 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:28:03.711408 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:03.711386 2575 scope.go:117] "RemoveContainer" containerID="ad141ee5fbdbf42b400c371e5c875c32dc97fd68550552107f96823c0b1d044b" Apr 24 21:28:03.712840 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:03.712818 2575 generic.go:358] "Generic (PLEG): container finished" podID="edd258c5-66bc-4d60-8302-4f99f9bfa7dc" containerID="783eff8a8c29f587ff2f5542c44061ada45e601197339d80ea816b33956629dc" exitCode=0 Apr 24 21:28:03.712922 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:03.712857 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kb7pn" event={"ID":"edd258c5-66bc-4d60-8302-4f99f9bfa7dc","Type":"ContainerDied","Data":"783eff8a8c29f587ff2f5542c44061ada45e601197339d80ea816b33956629dc"} Apr 24 21:28:03.726344 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:03.726325 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:28:04.578280 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:04.578132 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l78bh" Apr 24 21:28:04.578366 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:04.578132 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm45" Apr 24 21:28:04.578366 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:04.578322 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l78bh" podUID="6ec8ce7f-d73f-4ff5-a981-9d84448a51a6" Apr 24 21:28:04.578450 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:04.578390 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm45" podUID="a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f" Apr 24 21:28:04.667715 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:04.667689 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l78bh"] Apr 24 21:28:04.670231 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:04.670206 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2sm45"] Apr 24 21:28:04.718046 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:04.718027 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 21:28:04.718550 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:04.718320 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" event={"ID":"88ca30bc-f546-43f6-8751-e5c36307eb86","Type":"ContainerStarted","Data":"f5b2b8bbc771fd6777cd6553e6789a7a57b9ce749c541ce6975836fc505a96ab"} Apr 24 21:28:04.718725 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:04.718707 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:28:04.720308 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:04.720287 2575 generic.go:358] "Generic (PLEG): container finished" podID="edd258c5-66bc-4d60-8302-4f99f9bfa7dc" 
containerID="d859c492ca42b3ffc067fd0a757fb94c38cdc170cf24ecf6949eda17481bbc67" exitCode=0 Apr 24 21:28:04.720435 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:04.720360 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm45" Apr 24 21:28:04.720435 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:04.720375 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kb7pn" event={"ID":"edd258c5-66bc-4d60-8302-4f99f9bfa7dc","Type":"ContainerDied","Data":"d859c492ca42b3ffc067fd0a757fb94c38cdc170cf24ecf6949eda17481bbc67"} Apr 24 21:28:04.720503 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:04.720450 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm45" podUID="a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f" Apr 24 21:28:04.720551 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:04.720530 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l78bh" Apr 24 21:28:04.720635 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:04.720618 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l78bh" podUID="6ec8ce7f-d73f-4ff5-a981-9d84448a51a6" Apr 24 21:28:04.732435 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:04.732406 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" Apr 24 21:28:04.752274 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:04.752240 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zglkz" podStartSLOduration=10.183575523 podStartE2EDuration="26.752230551s" podCreationTimestamp="2026-04-24 21:27:38 +0000 UTC" firstStartedPulling="2026-04-24 21:27:41.01596141 +0000 UTC m=+2.955807124" lastFinishedPulling="2026-04-24 21:27:57.584616433 +0000 UTC m=+19.524462152" observedRunningTime="2026-04-24 21:28:04.751570254 +0000 UTC m=+26.691415988" watchObservedRunningTime="2026-04-24 21:28:04.752230551 +0000 UTC m=+26.692076284" Apr 24 21:28:05.723528 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:05.723501 2575 generic.go:358] "Generic (PLEG): container finished" podID="edd258c5-66bc-4d60-8302-4f99f9bfa7dc" containerID="b31b18394b197c52f08a3987eea196c72bbf01ef27e3c55e2e2fedfeeb2f507c" exitCode=0 Apr 24 21:28:05.723953 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:05.723584 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kb7pn" event={"ID":"edd258c5-66bc-4d60-8302-4f99f9bfa7dc","Type":"ContainerDied","Data":"b31b18394b197c52f08a3987eea196c72bbf01ef27e3c55e2e2fedfeeb2f507c"} Apr 24 21:28:06.578452 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:06.578390 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l78bh" Apr 24 21:28:06.578452 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:06.578449 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm45" Apr 24 21:28:06.578633 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:06.578531 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l78bh" podUID="6ec8ce7f-d73f-4ff5-a981-9d84448a51a6" Apr 24 21:28:06.578775 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:06.578678 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm45" podUID="a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f" Apr 24 21:28:08.325293 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:08.325263 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-58dwf" Apr 24 21:28:08.325853 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:08.325402 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 21:28:08.326001 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:08.325984 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-58dwf" Apr 24 21:28:08.579992 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:08.579918 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm45" Apr 24 21:28:08.580116 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:08.580012 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l78bh" Apr 24 21:28:08.580116 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:08.580041 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm45" podUID="a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f" Apr 24 21:28:08.580116 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:08.580094 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l78bh" podUID="6ec8ce7f-d73f-4ff5-a981-9d84448a51a6" Apr 24 21:28:10.430047 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.429844 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-160.ec2.internal" event="NodeReady" Apr 24 21:28:10.430467 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.430143 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 21:28:10.481470 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.481378 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8m598"] Apr 24 21:28:10.503573 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.503547 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-pzzlw"] Apr 24 21:28:10.503733 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.503705 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-8m598" Apr 24 21:28:10.506232 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.506200 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:28:10.506329 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.506229 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5mvz7\"" Apr 24 21:28:10.506706 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.506531 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:28:10.519810 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.519792 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pzzlw"] Apr 24 21:28:10.519810 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.519813 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8m598"] Apr 24 21:28:10.519974 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.519896 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pzzlw" Apr 24 21:28:10.524985 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.524965 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:28:10.525196 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.525175 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:28:10.525324 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.525272 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-48pk8\"" Apr 24 21:28:10.525324 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.525280 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:28:10.578570 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.578549 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l78bh" Apr 24 21:28:10.578676 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.578576 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm45" Apr 24 21:28:10.583453 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.583414 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sctkp\"" Apr 24 21:28:10.583558 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.583451 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mgr5d\"" Apr 24 21:28:10.583615 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.583572 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:28:10.583615 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.583451 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:28:10.583708 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.583681 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:28:10.595358 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.595338 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-config-volume\") pod \"dns-default-8m598\" (UID: \"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98\") " pod="openshift-dns/dns-default-8m598" Apr 24 21:28:10.595459 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.595388 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls\") pod \"dns-default-8m598\" (UID: \"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98\") " pod="openshift-dns/dns-default-8m598" Apr 24 
21:28:10.595459 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.595415 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47tvb\" (UniqueName: \"kubernetes.io/projected/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-kube-api-access-47tvb\") pod \"dns-default-8m598\" (UID: \"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98\") " pod="openshift-dns/dns-default-8m598" Apr 24 21:28:10.595545 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.595475 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-tmp-dir\") pod \"dns-default-8m598\" (UID: \"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98\") " pod="openshift-dns/dns-default-8m598" Apr 24 21:28:10.696103 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.696075 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47tvb\" (UniqueName: \"kubernetes.io/projected/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-kube-api-access-47tvb\") pod \"dns-default-8m598\" (UID: \"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98\") " pod="openshift-dns/dns-default-8m598" Apr 24 21:28:10.696229 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.696115 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-tmp-dir\") pod \"dns-default-8m598\" (UID: \"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98\") " pod="openshift-dns/dns-default-8m598" Apr 24 21:28:10.696229 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.696185 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9twq\" (UniqueName: \"kubernetes.io/projected/51fc9513-bf57-4b5f-9a7c-f7325f046b26-kube-api-access-d9twq\") pod \"ingress-canary-pzzlw\" (UID: \"51fc9513-bf57-4b5f-9a7c-f7325f046b26\") " 
pod="openshift-ingress-canary/ingress-canary-pzzlw"
Apr 24 21:28:10.696229 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.696214 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-config-volume\") pod \"dns-default-8m598\" (UID: \"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98\") " pod="openshift-dns/dns-default-8m598"
Apr 24 21:28:10.696376 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.696281 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls\") pod \"dns-default-8m598\" (UID: \"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98\") " pod="openshift-dns/dns-default-8m598"
Apr 24 21:28:10.696376 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.696307 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert\") pod \"ingress-canary-pzzlw\" (UID: \"51fc9513-bf57-4b5f-9a7c-f7325f046b26\") " pod="openshift-ingress-canary/ingress-canary-pzzlw"
Apr 24 21:28:10.696502 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:10.696487 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:28:10.696554 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:10.696548 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls podName:4c8553a4-97bd-43aa-a9ab-7ccbb4358a98 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:11.196532436 +0000 UTC m=+33.136378147 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls") pod "dns-default-8m598" (UID: "4c8553a4-97bd-43aa-a9ab-7ccbb4358a98") : secret "dns-default-metrics-tls" not found
Apr 24 21:28:10.696609 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.696585 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-tmp-dir\") pod \"dns-default-8m598\" (UID: \"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98\") " pod="openshift-dns/dns-default-8m598"
Apr 24 21:28:10.696860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.696839 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-config-volume\") pod \"dns-default-8m598\" (UID: \"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98\") " pod="openshift-dns/dns-default-8m598"
Apr 24 21:28:10.708357 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.708339 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47tvb\" (UniqueName: \"kubernetes.io/projected/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-kube-api-access-47tvb\") pod \"dns-default-8m598\" (UID: \"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98\") " pod="openshift-dns/dns-default-8m598"
Apr 24 21:28:10.796657 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.796594 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert\") pod \"ingress-canary-pzzlw\" (UID: \"51fc9513-bf57-4b5f-9a7c-f7325f046b26\") " pod="openshift-ingress-canary/ingress-canary-pzzlw"
Apr 24 21:28:10.796777 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.796662 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d9twq\" (UniqueName: \"kubernetes.io/projected/51fc9513-bf57-4b5f-9a7c-f7325f046b26-kube-api-access-d9twq\") pod \"ingress-canary-pzzlw\" (UID: \"51fc9513-bf57-4b5f-9a7c-f7325f046b26\") " pod="openshift-ingress-canary/ingress-canary-pzzlw"
Apr 24 21:28:10.796777 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:10.796750 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:28:10.796862 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:10.796822 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert podName:51fc9513-bf57-4b5f-9a7c-f7325f046b26 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:11.296800365 +0000 UTC m=+33.236646094 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert") pod "ingress-canary-pzzlw" (UID: "51fc9513-bf57-4b5f-9a7c-f7325f046b26") : secret "canary-serving-cert" not found
Apr 24 21:28:10.806497 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:10.806477 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9twq\" (UniqueName: \"kubernetes.io/projected/51fc9513-bf57-4b5f-9a7c-f7325f046b26-kube-api-access-d9twq\") pod \"ingress-canary-pzzlw\" (UID: \"51fc9513-bf57-4b5f-9a7c-f7325f046b26\") " pod="openshift-ingress-canary/ingress-canary-pzzlw"
Apr 24 21:28:11.199361 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:11.199055 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls\") pod \"dns-default-8m598\" (UID: \"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98\") " pod="openshift-dns/dns-default-8m598"
Apr 24 21:28:11.199361 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:11.199221 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:28:11.199361 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:11.199295 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls podName:4c8553a4-97bd-43aa-a9ab-7ccbb4358a98 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:12.199274965 +0000 UTC m=+34.139120682 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls") pod "dns-default-8m598" (UID: "4c8553a4-97bd-43aa-a9ab-7ccbb4358a98") : secret "dns-default-metrics-tls" not found
Apr 24 21:28:11.299717 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:11.299691 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert\") pod \"ingress-canary-pzzlw\" (UID: \"51fc9513-bf57-4b5f-9a7c-f7325f046b26\") " pod="openshift-ingress-canary/ingress-canary-pzzlw"
Apr 24 21:28:11.299861 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:11.299800 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:28:11.299861 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:11.299857 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert podName:51fc9513-bf57-4b5f-9a7c-f7325f046b26 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:12.299841311 +0000 UTC m=+34.239687037 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert") pod "ingress-canary-pzzlw" (UID: "51fc9513-bf57-4b5f-9a7c-f7325f046b26") : secret "canary-serving-cert" not found
Apr 24 21:28:11.736585 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:11.736523 2575 generic.go:358] "Generic (PLEG): container finished" podID="edd258c5-66bc-4d60-8302-4f99f9bfa7dc" containerID="bdb90d5aad1186a79d5f1f5a62de7fed2284f43738e9c3a26cac058f6cf79b32" exitCode=0
Apr 24 21:28:11.736585 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:11.736569 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kb7pn" event={"ID":"edd258c5-66bc-4d60-8302-4f99f9bfa7dc","Type":"ContainerDied","Data":"bdb90d5aad1186a79d5f1f5a62de7fed2284f43738e9c3a26cac058f6cf79b32"}
Apr 24 21:28:12.205376 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:12.205351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs\") pod \"network-metrics-daemon-l78bh\" (UID: \"6ec8ce7f-d73f-4ff5-a981-9d84448a51a6\") " pod="openshift-multus/network-metrics-daemon-l78bh"
Apr 24 21:28:12.205555 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:12.205401 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls\") pod \"dns-default-8m598\" (UID: \"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98\") " pod="openshift-dns/dns-default-8m598"
Apr 24 21:28:12.205555 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:12.205511 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:28:12.205621 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:12.205511 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 21:28:12.205621 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:12.205602 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls podName:4c8553a4-97bd-43aa-a9ab-7ccbb4358a98 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:14.205582546 +0000 UTC m=+36.145428261 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls") pod "dns-default-8m598" (UID: "4c8553a4-97bd-43aa-a9ab-7ccbb4358a98") : secret "dns-default-metrics-tls" not found
Apr 24 21:28:12.205689 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:12.205632 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs podName:6ec8ce7f-d73f-4ff5-a981-9d84448a51a6 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:44.205621575 +0000 UTC m=+66.145467291 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs") pod "network-metrics-daemon-l78bh" (UID: "6ec8ce7f-d73f-4ff5-a981-9d84448a51a6") : secret "metrics-daemon-secret" not found
Apr 24 21:28:12.306523 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:12.306498 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrv9c\" (UniqueName: \"kubernetes.io/projected/a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f-kube-api-access-lrv9c\") pod \"network-check-target-2sm45\" (UID: \"a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f\") " pod="openshift-network-diagnostics/network-check-target-2sm45"
Apr 24 21:28:12.306645 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:12.306541 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert\") pod \"ingress-canary-pzzlw\" (UID: \"51fc9513-bf57-4b5f-9a7c-f7325f046b26\") " pod="openshift-ingress-canary/ingress-canary-pzzlw"
Apr 24 21:28:12.306774 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:12.306758 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:28:12.306820 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:12.306811 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert podName:51fc9513-bf57-4b5f-9a7c-f7325f046b26 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:14.306797966 +0000 UTC m=+36.246643677 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert") pod "ingress-canary-pzzlw" (UID: "51fc9513-bf57-4b5f-9a7c-f7325f046b26") : secret "canary-serving-cert" not found
Apr 24 21:28:12.309283 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:12.309261 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrv9c\" (UniqueName: \"kubernetes.io/projected/a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f-kube-api-access-lrv9c\") pod \"network-check-target-2sm45\" (UID: \"a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f\") " pod="openshift-network-diagnostics/network-check-target-2sm45"
Apr 24 21:28:12.395347 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:12.395328 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm45"
Apr 24 21:28:12.558031 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:12.558003 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2sm45"]
Apr 24 21:28:12.561773 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:28:12.561749 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3e1bc5e_3bc3_4e15_a162_3e3b6e59374f.slice/crio-3dbc5a34762fa8b3a4b90ccab6615d0127cc099f41da3e224981c253ba4a5f32 WatchSource:0}: Error finding container 3dbc5a34762fa8b3a4b90ccab6615d0127cc099f41da3e224981c253ba4a5f32: Status 404 returned error can't find the container with id 3dbc5a34762fa8b3a4b90ccab6615d0127cc099f41da3e224981c253ba4a5f32
Apr 24 21:28:12.739292 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:12.739223 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2sm45" event={"ID":"a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f","Type":"ContainerStarted","Data":"3dbc5a34762fa8b3a4b90ccab6615d0127cc099f41da3e224981c253ba4a5f32"}
Apr 24 21:28:12.741635 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:12.741606 2575 generic.go:358] "Generic (PLEG): container finished" podID="edd258c5-66bc-4d60-8302-4f99f9bfa7dc" containerID="565ce5df36c1c930c8ad748ffee17ee787f1158c4abc5d3a773bb233b2839b80" exitCode=0
Apr 24 21:28:12.741737 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:12.741646 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kb7pn" event={"ID":"edd258c5-66bc-4d60-8302-4f99f9bfa7dc","Type":"ContainerDied","Data":"565ce5df36c1c930c8ad748ffee17ee787f1158c4abc5d3a773bb233b2839b80"}
Apr 24 21:28:13.747273 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:13.747089 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kb7pn" event={"ID":"edd258c5-66bc-4d60-8302-4f99f9bfa7dc","Type":"ContainerStarted","Data":"eee3485b848a812034c6eaca12d8c20eeab509e9ec84f920d29826d9f4119fef"}
Apr 24 21:28:13.781510 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:13.781470 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kb7pn" podStartSLOduration=5.465225231 podStartE2EDuration="35.781452595s" podCreationTimestamp="2026-04-24 21:27:38 +0000 UTC" firstStartedPulling="2026-04-24 21:27:41.014607065 +0000 UTC m=+2.954452776" lastFinishedPulling="2026-04-24 21:28:11.330834425 +0000 UTC m=+33.270680140" observedRunningTime="2026-04-24 21:28:13.781320079 +0000 UTC m=+35.721165813" watchObservedRunningTime="2026-04-24 21:28:13.781452595 +0000 UTC m=+35.721298329"
Apr 24 21:28:14.222352 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:14.222314 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls\") pod \"dns-default-8m598\" (UID: \"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98\") " pod="openshift-dns/dns-default-8m598"
Apr 24 21:28:14.222524 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:14.222497 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:28:14.222585 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:14.222567 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls podName:4c8553a4-97bd-43aa-a9ab-7ccbb4358a98 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:18.222552889 +0000 UTC m=+40.162398600 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls") pod "dns-default-8m598" (UID: "4c8553a4-97bd-43aa-a9ab-7ccbb4358a98") : secret "dns-default-metrics-tls" not found
Apr 24 21:28:14.323514 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:14.323479 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert\") pod \"ingress-canary-pzzlw\" (UID: \"51fc9513-bf57-4b5f-9a7c-f7325f046b26\") " pod="openshift-ingress-canary/ingress-canary-pzzlw"
Apr 24 21:28:14.323685 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:14.323648 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:28:14.323749 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:14.323716 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert podName:51fc9513-bf57-4b5f-9a7c-f7325f046b26 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:18.323695014 +0000 UTC m=+40.263540728 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert") pod "ingress-canary-pzzlw" (UID: "51fc9513-bf57-4b5f-9a7c-f7325f046b26") : secret "canary-serving-cert" not found
Apr 24 21:28:15.752098 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:15.752064 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2sm45" event={"ID":"a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f","Type":"ContainerStarted","Data":"75581dc38838bb6cb4d1708fb6cc2bd1bcb72d3b9220f2e27bb5d5bd77f9c9b2"}
Apr 24 21:28:15.752695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:15.752184 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-2sm45"
Apr 24 21:28:15.774628 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:15.774587 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-2sm45" podStartSLOduration=35.032305619 podStartE2EDuration="37.774575852s" podCreationTimestamp="2026-04-24 21:27:38 +0000 UTC" firstStartedPulling="2026-04-24 21:28:12.563482683 +0000 UTC m=+34.503328394" lastFinishedPulling="2026-04-24 21:28:15.305752916 +0000 UTC m=+37.245598627" observedRunningTime="2026-04-24 21:28:15.774572858 +0000 UTC m=+37.714418590" watchObservedRunningTime="2026-04-24 21:28:15.774575852 +0000 UTC m=+37.714421584"
Apr 24 21:28:18.249777 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:18.249745 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls\") pod \"dns-default-8m598\" (UID: \"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98\") " pod="openshift-dns/dns-default-8m598"
Apr 24 21:28:18.250223 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:18.249887 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:28:18.250223 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:18.249950 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls podName:4c8553a4-97bd-43aa-a9ab-7ccbb4358a98 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:26.249935037 +0000 UTC m=+48.189780748 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls") pod "dns-default-8m598" (UID: "4c8553a4-97bd-43aa-a9ab-7ccbb4358a98") : secret "dns-default-metrics-tls" not found
Apr 24 21:28:18.350232 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:18.350212 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert\") pod \"ingress-canary-pzzlw\" (UID: \"51fc9513-bf57-4b5f-9a7c-f7325f046b26\") " pod="openshift-ingress-canary/ingress-canary-pzzlw"
Apr 24 21:28:18.350308 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:18.350294 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:28:18.350341 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:18.350337 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert podName:51fc9513-bf57-4b5f-9a7c-f7325f046b26 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:26.350324855 +0000 UTC m=+48.290170567 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert") pod "ingress-canary-pzzlw" (UID: "51fc9513-bf57-4b5f-9a7c-f7325f046b26") : secret "canary-serving-cert" not found
Apr 24 21:28:26.303069 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:26.303039 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls\") pod \"dns-default-8m598\" (UID: \"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98\") " pod="openshift-dns/dns-default-8m598"
Apr 24 21:28:26.303531 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:26.303171 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:28:26.303531 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:26.303238 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls podName:4c8553a4-97bd-43aa-a9ab-7ccbb4358a98 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:42.303223392 +0000 UTC m=+64.243069103 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls") pod "dns-default-8m598" (UID: "4c8553a4-97bd-43aa-a9ab-7ccbb4358a98") : secret "dns-default-metrics-tls" not found
Apr 24 21:28:26.403974 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:26.403944 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert\") pod \"ingress-canary-pzzlw\" (UID: \"51fc9513-bf57-4b5f-9a7c-f7325f046b26\") " pod="openshift-ingress-canary/ingress-canary-pzzlw"
Apr 24 21:28:26.404067 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:26.404043 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:28:26.404108 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:26.404088 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert podName:51fc9513-bf57-4b5f-9a7c-f7325f046b26 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:42.404076512 +0000 UTC m=+64.343922224 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert") pod "ingress-canary-pzzlw" (UID: "51fc9513-bf57-4b5f-9a7c-f7325f046b26") : secret "canary-serving-cert" not found
Apr 24 21:28:36.735273 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:36.735247 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zglkz"
Apr 24 21:28:42.401335 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:42.401302 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls\") pod \"dns-default-8m598\" (UID: \"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98\") " pod="openshift-dns/dns-default-8m598"
Apr 24 21:28:42.401808 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:42.401403 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:28:42.401808 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:42.401474 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls podName:4c8553a4-97bd-43aa-a9ab-7ccbb4358a98 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:14.401460696 +0000 UTC m=+96.341306407 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls") pod "dns-default-8m598" (UID: "4c8553a4-97bd-43aa-a9ab-7ccbb4358a98") : secret "dns-default-metrics-tls" not found
Apr 24 21:28:42.502287 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:42.502255 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert\") pod \"ingress-canary-pzzlw\" (UID: \"51fc9513-bf57-4b5f-9a7c-f7325f046b26\") " pod="openshift-ingress-canary/ingress-canary-pzzlw"
Apr 24 21:28:42.502398 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:42.502346 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:28:42.502398 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:42.502391 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert podName:51fc9513-bf57-4b5f-9a7c-f7325f046b26 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:14.502379756 +0000 UTC m=+96.442225467 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert") pod "ingress-canary-pzzlw" (UID: "51fc9513-bf57-4b5f-9a7c-f7325f046b26") : secret "canary-serving-cert" not found
Apr 24 21:28:44.214511 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:44.214481 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs\") pod \"network-metrics-daemon-l78bh\" (UID: \"6ec8ce7f-d73f-4ff5-a981-9d84448a51a6\") " pod="openshift-multus/network-metrics-daemon-l78bh"
Apr 24 21:28:44.214880 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:44.214595 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 21:28:44.214880 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:28:44.214646 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs podName:6ec8ce7f-d73f-4ff5-a981-9d84448a51a6 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:48.214633386 +0000 UTC m=+130.154479097 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs") pod "network-metrics-daemon-l78bh" (UID: "6ec8ce7f-d73f-4ff5-a981-9d84448a51a6") : secret "metrics-daemon-secret" not found
Apr 24 21:28:46.756245 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:28:46.756213 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-2sm45"
Apr 24 21:29:14.403447 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:14.403398 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls\") pod \"dns-default-8m598\" (UID: \"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98\") " pod="openshift-dns/dns-default-8m598"
Apr 24 21:29:14.403809 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:29:14.403547 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:29:14.403809 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:29:14.403609 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls podName:4c8553a4-97bd-43aa-a9ab-7ccbb4358a98 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:18.403594898 +0000 UTC m=+160.343440609 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls") pod "dns-default-8m598" (UID: "4c8553a4-97bd-43aa-a9ab-7ccbb4358a98") : secret "dns-default-metrics-tls" not found
Apr 24 21:29:14.504496 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:14.504435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert\") pod \"ingress-canary-pzzlw\" (UID: \"51fc9513-bf57-4b5f-9a7c-f7325f046b26\") " pod="openshift-ingress-canary/ingress-canary-pzzlw"
Apr 24 21:29:14.504595 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:29:14.504528 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:29:14.504595 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:29:14.504567 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert podName:51fc9513-bf57-4b5f-9a7c-f7325f046b26 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:18.504556795 +0000 UTC m=+160.444402506 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert") pod "ingress-canary-pzzlw" (UID: "51fc9513-bf57-4b5f-9a7c-f7325f046b26") : secret "canary-serving-cert" not found
Apr 24 21:29:35.975964 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:35.975932 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xl4xs"]
Apr 24 21:29:35.978645 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:35.978628 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xl4xs"
Apr 24 21:29:35.981571 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:35.981547 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-9ffg6\""
Apr 24 21:29:35.981986 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:35.981971 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:29:35.983906 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:35.983890 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 24 21:29:35.997825 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:35.997798 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xl4xs"]
Apr 24 21:29:36.039752 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.039720 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxl66\" (UniqueName: \"kubernetes.io/projected/0c63e97e-44fc-421c-8de0-988acb06e78e-kube-api-access-rxl66\") pod \"volume-data-source-validator-7c6cbb6c87-xl4xs\" (UID: \"0c63e97e-44fc-421c-8de0-988acb06e78e\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xl4xs"
Apr 24 21:29:36.090205 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.090185 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-l5zd6"]
Apr 24 21:29:36.092699 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.092687 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-l5zd6"
Apr 24 21:29:36.095588 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.095564 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 24 21:29:36.095682 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.095586 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 24 21:29:36.096793 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.096774 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 21:29:36.097108 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.097091 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 21:29:36.097231 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.097214 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-8nvhh\""
Apr 24 21:29:36.103350 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.103330 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 24 21:29:36.115940 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.115921 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-l5zd6"]
Apr 24 21:29:36.140097 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.140064 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxl66\" (UniqueName: \"kubernetes.io/projected/0c63e97e-44fc-421c-8de0-988acb06e78e-kube-api-access-rxl66\") pod \"volume-data-source-validator-7c6cbb6c87-xl4xs\" (UID: \"0c63e97e-44fc-421c-8de0-988acb06e78e\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xl4xs"
Apr 24 21:29:36.140190 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.140135 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/114bfe15-0df7-402e-b377-0bf72321706b-tmp\") pod \"insights-operator-585dfdc468-l5zd6\" (UID: \"114bfe15-0df7-402e-b377-0bf72321706b\") " pod="openshift-insights/insights-operator-585dfdc468-l5zd6"
Apr 24 21:29:36.140190 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.140168 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpq2l\" (UniqueName: \"kubernetes.io/projected/114bfe15-0df7-402e-b377-0bf72321706b-kube-api-access-kpq2l\") pod \"insights-operator-585dfdc468-l5zd6\" (UID: \"114bfe15-0df7-402e-b377-0bf72321706b\") " pod="openshift-insights/insights-operator-585dfdc468-l5zd6"
Apr 24 21:29:36.140293 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.140197 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/114bfe15-0df7-402e-b377-0bf72321706b-service-ca-bundle\") pod \"insights-operator-585dfdc468-l5zd6\" (UID: \"114bfe15-0df7-402e-b377-0bf72321706b\") " pod="openshift-insights/insights-operator-585dfdc468-l5zd6"
Apr 24 21:29:36.140293 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.140251 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/114bfe15-0df7-402e-b377-0bf72321706b-serving-cert\") pod \"insights-operator-585dfdc468-l5zd6\" (UID: \"114bfe15-0df7-402e-b377-0bf72321706b\") " pod="openshift-insights/insights-operator-585dfdc468-l5zd6"
Apr 24 21:29:36.140396 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.140308 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/114bfe15-0df7-402e-b377-0bf72321706b-snapshots\") pod \"insights-operator-585dfdc468-l5zd6\" (UID: \"114bfe15-0df7-402e-b377-0bf72321706b\") " pod="openshift-insights/insights-operator-585dfdc468-l5zd6"
Apr 24 21:29:36.140396 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.140345 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/114bfe15-0df7-402e-b377-0bf72321706b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-l5zd6\" (UID: \"114bfe15-0df7-402e-b377-0bf72321706b\") " pod="openshift-insights/insights-operator-585dfdc468-l5zd6"
Apr 24 21:29:36.150695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.150677 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxl66\" (UniqueName: \"kubernetes.io/projected/0c63e97e-44fc-421c-8de0-988acb06e78e-kube-api-access-rxl66\") pod \"volume-data-source-validator-7c6cbb6c87-xl4xs\" (UID: \"0c63e97e-44fc-421c-8de0-988acb06e78e\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xl4xs"
Apr 24 21:29:36.240612 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.240566 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/114bfe15-0df7-402e-b377-0bf72321706b-tmp\") pod \"insights-operator-585dfdc468-l5zd6\" (UID: \"114bfe15-0df7-402e-b377-0bf72321706b\") " pod="openshift-insights/insights-operator-585dfdc468-l5zd6"
Apr 24 21:29:36.240612 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.240593 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpq2l\" (UniqueName: \"kubernetes.io/projected/114bfe15-0df7-402e-b377-0bf72321706b-kube-api-access-kpq2l\") pod \"insights-operator-585dfdc468-l5zd6\" (UID: \"114bfe15-0df7-402e-b377-0bf72321706b\") " pod="openshift-insights/insights-operator-585dfdc468-l5zd6"
Apr 24 21:29:36.240612 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.240610 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/114bfe15-0df7-402e-b377-0bf72321706b-service-ca-bundle\") pod \"insights-operator-585dfdc468-l5zd6\" (UID: \"114bfe15-0df7-402e-b377-0bf72321706b\") " pod="openshift-insights/insights-operator-585dfdc468-l5zd6"
Apr 24 21:29:36.240746 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.240627 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/114bfe15-0df7-402e-b377-0bf72321706b-serving-cert\") pod \"insights-operator-585dfdc468-l5zd6\" (UID: \"114bfe15-0df7-402e-b377-0bf72321706b\") " pod="openshift-insights/insights-operator-585dfdc468-l5zd6"
Apr 24 21:29:36.240746 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.240651 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/114bfe15-0df7-402e-b377-0bf72321706b-snapshots\") pod \"insights-operator-585dfdc468-l5zd6\" (UID: \"114bfe15-0df7-402e-b377-0bf72321706b\") " pod="openshift-insights/insights-operator-585dfdc468-l5zd6"
Apr 24 21:29:36.240746 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.240682 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/114bfe15-0df7-402e-b377-0bf72321706b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-l5zd6\" (UID: \"114bfe15-0df7-402e-b377-0bf72321706b\") " pod="openshift-insights/insights-operator-585dfdc468-l5zd6"
Apr 24 21:29:36.241135 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.241109 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\"
(UniqueName: \"kubernetes.io/configmap/114bfe15-0df7-402e-b377-0bf72321706b-service-ca-bundle\") pod \"insights-operator-585dfdc468-l5zd6\" (UID: \"114bfe15-0df7-402e-b377-0bf72321706b\") " pod="openshift-insights/insights-operator-585dfdc468-l5zd6" Apr 24 21:29:36.241251 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.241165 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/114bfe15-0df7-402e-b377-0bf72321706b-tmp\") pod \"insights-operator-585dfdc468-l5zd6\" (UID: \"114bfe15-0df7-402e-b377-0bf72321706b\") " pod="openshift-insights/insights-operator-585dfdc468-l5zd6" Apr 24 21:29:36.241251 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.241217 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/114bfe15-0df7-402e-b377-0bf72321706b-snapshots\") pod \"insights-operator-585dfdc468-l5zd6\" (UID: \"114bfe15-0df7-402e-b377-0bf72321706b\") " pod="openshift-insights/insights-operator-585dfdc468-l5zd6" Apr 24 21:29:36.241585 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.241566 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/114bfe15-0df7-402e-b377-0bf72321706b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-l5zd6\" (UID: \"114bfe15-0df7-402e-b377-0bf72321706b\") " pod="openshift-insights/insights-operator-585dfdc468-l5zd6" Apr 24 21:29:36.242671 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.242656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/114bfe15-0df7-402e-b377-0bf72321706b-serving-cert\") pod \"insights-operator-585dfdc468-l5zd6\" (UID: \"114bfe15-0df7-402e-b377-0bf72321706b\") " pod="openshift-insights/insights-operator-585dfdc468-l5zd6" Apr 24 21:29:36.248926 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.248909 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpq2l\" (UniqueName: \"kubernetes.io/projected/114bfe15-0df7-402e-b377-0bf72321706b-kube-api-access-kpq2l\") pod \"insights-operator-585dfdc468-l5zd6\" (UID: \"114bfe15-0df7-402e-b377-0bf72321706b\") " pod="openshift-insights/insights-operator-585dfdc468-l5zd6" Apr 24 21:29:36.286672 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.286650 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xl4xs" Apr 24 21:29:36.400997 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.400970 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-l5zd6" Apr 24 21:29:36.410933 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.410909 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xl4xs"] Apr 24 21:29:36.413845 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:29:36.413821 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c63e97e_44fc_421c_8de0_988acb06e78e.slice/crio-eeb45446761d525a8c12bcebb5e6f9ed9764241389be609bda12ed84c9acf8a8 WatchSource:0}: Error finding container eeb45446761d525a8c12bcebb5e6f9ed9764241389be609bda12ed84c9acf8a8: Status 404 returned error can't find the container with id eeb45446761d525a8c12bcebb5e6f9ed9764241389be609bda12ed84c9acf8a8 Apr 24 21:29:36.507317 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.507253 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-l5zd6"] Apr 24 21:29:36.509972 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:29:36.509947 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod114bfe15_0df7_402e_b377_0bf72321706b.slice/crio-e0c749a4e5999160caa736b7451777221fb5d9a1f4e7c8113acaef5b0d4ed196 WatchSource:0}: Error finding container e0c749a4e5999160caa736b7451777221fb5d9a1f4e7c8113acaef5b0d4ed196: Status 404 returned error can't find the container with id e0c749a4e5999160caa736b7451777221fb5d9a1f4e7c8113acaef5b0d4ed196 Apr 24 21:29:36.895245 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.895175 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xl4xs" event={"ID":"0c63e97e-44fc-421c-8de0-988acb06e78e","Type":"ContainerStarted","Data":"eeb45446761d525a8c12bcebb5e6f9ed9764241389be609bda12ed84c9acf8a8"} Apr 24 21:29:36.896204 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:36.896179 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-l5zd6" event={"ID":"114bfe15-0df7-402e-b377-0bf72321706b","Type":"ContainerStarted","Data":"e0c749a4e5999160caa736b7451777221fb5d9a1f4e7c8113acaef5b0d4ed196"} Apr 24 21:29:38.901316 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:38.901279 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-l5zd6" event={"ID":"114bfe15-0df7-402e-b377-0bf72321706b","Type":"ContainerStarted","Data":"690139a763246477579fc4b2213796275b6991744d35c34392f3a40022fc22d9"} Apr 24 21:29:38.902658 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:38.902634 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xl4xs" event={"ID":"0c63e97e-44fc-421c-8de0-988acb06e78e","Type":"ContainerStarted","Data":"3ad9c2ab9803350dc7c1084e8cd75a1318587218e8c23a396f7cec70cff5c09e"} Apr 24 21:29:38.922275 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:38.922236 2575 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-insights/insights-operator-585dfdc468-l5zd6" podStartSLOduration=1.283178387 podStartE2EDuration="2.922223387s" podCreationTimestamp="2026-04-24 21:29:36 +0000 UTC" firstStartedPulling="2026-04-24 21:29:36.511730834 +0000 UTC m=+118.451576546" lastFinishedPulling="2026-04-24 21:29:38.150775822 +0000 UTC m=+120.090621546" observedRunningTime="2026-04-24 21:29:38.921203812 +0000 UTC m=+120.861049567" watchObservedRunningTime="2026-04-24 21:29:38.922223387 +0000 UTC m=+120.862069140" Apr 24 21:29:38.940218 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:38.940176 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xl4xs" podStartSLOduration=2.209057364 podStartE2EDuration="3.940164671s" podCreationTimestamp="2026-04-24 21:29:35 +0000 UTC" firstStartedPulling="2026-04-24 21:29:36.415803928 +0000 UTC m=+118.355649642" lastFinishedPulling="2026-04-24 21:29:38.146911231 +0000 UTC m=+120.086756949" observedRunningTime="2026-04-24 21:29:38.939158 +0000 UTC m=+120.879003744" watchObservedRunningTime="2026-04-24 21:29:38.940164671 +0000 UTC m=+120.880010403" Apr 24 21:29:41.538835 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:41.538807 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-vdppq_c69c2633-a089-45fd-9a6f-5b56c0d7beb1/dns-node-resolver/0.log" Apr 24 21:29:42.138727 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:42.138696 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-flsks_4327adce-270c-40d3-b3a2-3f3c1acfa545/node-ca/0.log" Apr 24 21:29:42.859259 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:42.859211 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5f95988bbc-b6txh"] Apr 24 21:29:42.863115 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:42.863093 2575 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:42.866155 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:42.866135 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 21:29:42.866384 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:42.866370 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 21:29:42.866632 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:42.866615 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6rrzj\"" Apr 24 21:29:42.866697 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:42.866664 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 21:29:42.871896 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:42.871879 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 21:29:42.874107 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:42.874089 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5f95988bbc-b6txh"] Apr 24 21:29:42.992646 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:42.992620 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-certificates\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:42.992782 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:42.992652 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-tls\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:42.992782 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:42.992691 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-bound-sa-token\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:42.992782 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:42.992742 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-image-registry-private-configuration\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:42.992782 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:42.992775 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2qtb\" (UniqueName: \"kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-kube-api-access-v2qtb\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:42.992912 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:42.992833 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-trusted-ca\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:42.992912 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:42.992879 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-ca-trust-extracted\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:42.992912 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:42.992894 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-installation-pull-secrets\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:43.093511 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.093479 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-certificates\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:43.093659 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.093518 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-tls\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " 
pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:43.093659 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.093537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-bound-sa-token\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:43.093659 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.093554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-image-registry-private-configuration\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:43.093659 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.093573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2qtb\" (UniqueName: \"kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-kube-api-access-v2qtb\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:43.093659 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.093604 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-trusted-ca\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:43.093659 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:29:43.093642 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: 
secret "image-registry-tls" not found Apr 24 21:29:43.093935 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:29:43.093663 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f95988bbc-b6txh: secret "image-registry-tls" not found Apr 24 21:29:43.093935 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:29:43.093738 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-tls podName:5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a nodeName:}" failed. No retries permitted until 2026-04-24 21:29:43.593715613 +0000 UTC m=+125.533561325 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-tls") pod "image-registry-5f95988bbc-b6txh" (UID: "5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a") : secret "image-registry-tls" not found Apr 24 21:29:43.093935 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.093647 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-ca-trust-extracted\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:43.093935 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.093865 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-installation-pull-secrets\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:43.094575 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.094556 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-ca-trust-extracted\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:43.094738 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.094717 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-certificates\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:43.094892 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.094874 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-trusted-ca\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:43.096101 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.096081 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-image-registry-private-configuration\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:43.096507 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.096489 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-installation-pull-secrets\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " 
pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:43.102481 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.102462 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2qtb\" (UniqueName: \"kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-kube-api-access-v2qtb\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:43.105206 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.105186 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-bound-sa-token\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:43.597263 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.597232 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-tls\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:43.597446 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:29:43.597342 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:29:43.597446 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:29:43.597353 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f95988bbc-b6txh: secret "image-registry-tls" not found Apr 24 21:29:43.597446 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:29:43.597409 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-tls podName:5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a nodeName:}" failed. No retries permitted until 2026-04-24 21:29:44.59739451 +0000 UTC m=+126.537240222 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-tls") pod "image-registry-5f95988bbc-b6txh" (UID: "5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a") : secret "image-registry-tls" not found Apr 24 21:29:43.791153 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.791122 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-szrrw"] Apr 24 21:29:43.795191 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.795175 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-szrrw" Apr 24 21:29:43.797739 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.797721 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 24 21:29:43.798282 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.798259 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-nqlmh\"" Apr 24 21:29:43.798891 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.798869 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:29:43.798962 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.798877 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 24 21:29:43.799138 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.799125 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 24 21:29:43.804075 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.804058 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-szrrw"] Apr 24 21:29:43.898833 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.898778 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-szrrw\" (UID: \"d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-szrrw" Apr 24 21:29:43.898833 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.898805 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk2j9\" (UniqueName: \"kubernetes.io/projected/d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f-kube-api-access-zk2j9\") pod \"kube-storage-version-migrator-operator-6769c5d45-szrrw\" (UID: \"d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-szrrw" Apr 24 21:29:43.898833 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.898831 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-szrrw\" (UID: 
\"d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-szrrw" Apr 24 21:29:43.999356 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.999336 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-szrrw\" (UID: \"d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-szrrw" Apr 24 21:29:43.999487 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.999362 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zk2j9\" (UniqueName: \"kubernetes.io/projected/d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f-kube-api-access-zk2j9\") pod \"kube-storage-version-migrator-operator-6769c5d45-szrrw\" (UID: \"d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-szrrw" Apr 24 21:29:43.999487 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.999387 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-szrrw\" (UID: \"d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-szrrw" Apr 24 21:29:43.999826 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:43.999808 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-szrrw\" (UID: 
\"d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-szrrw" Apr 24 21:29:44.001511 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:44.001489 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-szrrw\" (UID: \"d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-szrrw" Apr 24 21:29:44.007986 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:44.007962 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk2j9\" (UniqueName: \"kubernetes.io/projected/d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f-kube-api-access-zk2j9\") pod \"kube-storage-version-migrator-operator-6769c5d45-szrrw\" (UID: \"d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-szrrw" Apr 24 21:29:44.104330 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:44.104307 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-szrrw" Apr 24 21:29:44.212971 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:44.212943 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-szrrw"] Apr 24 21:29:44.215825 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:29:44.215796 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd51e6e0b_2cb3_4dcb_94ad_d4c80e19405f.slice/crio-0e38868a0d8fc22b0ab272ece773bdd351441f0f7a3cdbde6c1eb077ece221bb WatchSource:0}: Error finding container 0e38868a0d8fc22b0ab272ece773bdd351441f0f7a3cdbde6c1eb077ece221bb: Status 404 returned error can't find the container with id 0e38868a0d8fc22b0ab272ece773bdd351441f0f7a3cdbde6c1eb077ece221bb Apr 24 21:29:44.604622 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:44.604587 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-tls\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:44.604813 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:29:44.604702 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:29:44.604813 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:29:44.604718 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f95988bbc-b6txh: secret "image-registry-tls" not found Apr 24 21:29:44.604813 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:29:44.604788 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-tls podName:5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a nodeName:}" failed. No retries permitted until 2026-04-24 21:29:46.604769714 +0000 UTC m=+128.544615443 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-tls") pod "image-registry-5f95988bbc-b6txh" (UID: "5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a") : secret "image-registry-tls" not found Apr 24 21:29:44.914766 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:44.914682 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-szrrw" event={"ID":"d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f","Type":"ContainerStarted","Data":"0e38868a0d8fc22b0ab272ece773bdd351441f0f7a3cdbde6c1eb077ece221bb"} Apr 24 21:29:45.829223 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:45.829194 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fbtc5"] Apr 24 21:29:45.831854 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:45.831839 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fbtc5" Apr 24 21:29:45.834234 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:45.834217 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 24 21:29:45.834333 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:45.834239 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 24 21:29:45.834519 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:45.834502 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 24 21:29:45.835404 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:45.835388 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:29:45.835483 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:45.835388 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-m6nrs\"" Apr 24 21:29:45.840047 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:45.840022 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fbtc5"] Apr 24 21:29:45.914026 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:45.914000 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e9ba745-815f-4019-a172-e88557fff65c-config\") pod \"service-ca-operator-d6fc45fc5-fbtc5\" (UID: \"2e9ba745-815f-4019-a172-e88557fff65c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fbtc5" Apr 24 21:29:45.914159 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:45.914091 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqq9s\" (UniqueName: \"kubernetes.io/projected/2e9ba745-815f-4019-a172-e88557fff65c-kube-api-access-tqq9s\") pod \"service-ca-operator-d6fc45fc5-fbtc5\" (UID: \"2e9ba745-815f-4019-a172-e88557fff65c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fbtc5" Apr 24 21:29:45.914159 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:45.914124 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e9ba745-815f-4019-a172-e88557fff65c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-fbtc5\" (UID: \"2e9ba745-815f-4019-a172-e88557fff65c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fbtc5" Apr 24 21:29:45.917569 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:45.917544 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-szrrw" event={"ID":"d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f","Type":"ContainerStarted","Data":"9db1c70184c8c63d90359a59cfdd90b3293280e02148c74f181877190a0d0b25"} Apr 24 21:29:46.015195 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:46.015122 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqq9s\" (UniqueName: \"kubernetes.io/projected/2e9ba745-815f-4019-a172-e88557fff65c-kube-api-access-tqq9s\") pod \"service-ca-operator-d6fc45fc5-fbtc5\" (UID: \"2e9ba745-815f-4019-a172-e88557fff65c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fbtc5" Apr 24 21:29:46.015457 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:46.015433 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e9ba745-815f-4019-a172-e88557fff65c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-fbtc5\" (UID: 
\"2e9ba745-815f-4019-a172-e88557fff65c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fbtc5" Apr 24 21:29:46.015544 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:46.015476 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e9ba745-815f-4019-a172-e88557fff65c-config\") pod \"service-ca-operator-d6fc45fc5-fbtc5\" (UID: \"2e9ba745-815f-4019-a172-e88557fff65c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fbtc5" Apr 24 21:29:46.016006 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:46.015983 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e9ba745-815f-4019-a172-e88557fff65c-config\") pod \"service-ca-operator-d6fc45fc5-fbtc5\" (UID: \"2e9ba745-815f-4019-a172-e88557fff65c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fbtc5" Apr 24 21:29:46.017516 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:46.017499 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e9ba745-815f-4019-a172-e88557fff65c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-fbtc5\" (UID: \"2e9ba745-815f-4019-a172-e88557fff65c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fbtc5" Apr 24 21:29:46.027765 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:46.027744 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqq9s\" (UniqueName: \"kubernetes.io/projected/2e9ba745-815f-4019-a172-e88557fff65c-kube-api-access-tqq9s\") pod \"service-ca-operator-d6fc45fc5-fbtc5\" (UID: \"2e9ba745-815f-4019-a172-e88557fff65c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fbtc5" Apr 24 21:29:46.141039 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:46.141014 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fbtc5" Apr 24 21:29:46.253809 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:46.253752 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-szrrw" podStartSLOduration=1.7457681379999999 podStartE2EDuration="3.253731357s" podCreationTimestamp="2026-04-24 21:29:43 +0000 UTC" firstStartedPulling="2026-04-24 21:29:44.217639177 +0000 UTC m=+126.157484892" lastFinishedPulling="2026-04-24 21:29:45.725602399 +0000 UTC m=+127.665448111" observedRunningTime="2026-04-24 21:29:45.93601114 +0000 UTC m=+127.875856873" watchObservedRunningTime="2026-04-24 21:29:46.253731357 +0000 UTC m=+128.193577093" Apr 24 21:29:46.254242 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:46.254204 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fbtc5"] Apr 24 21:29:46.256605 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:29:46.256576 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e9ba745_815f_4019_a172_e88557fff65c.slice/crio-7fefd5ee52031d8ee8ae7117a668178fb982b5531a0a861eff3a1f56ebb6eaf0 WatchSource:0}: Error finding container 7fefd5ee52031d8ee8ae7117a668178fb982b5531a0a861eff3a1f56ebb6eaf0: Status 404 returned error can't find the container with id 7fefd5ee52031d8ee8ae7117a668178fb982b5531a0a861eff3a1f56ebb6eaf0 Apr 24 21:29:46.621134 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:46.621106 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-tls\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " 
pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:46.621305 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:29:46.621213 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:29:46.621305 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:29:46.621227 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f95988bbc-b6txh: secret "image-registry-tls" not found Apr 24 21:29:46.621305 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:29:46.621275 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-tls podName:5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a nodeName:}" failed. No retries permitted until 2026-04-24 21:29:50.621261765 +0000 UTC m=+132.561107476 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-tls") pod "image-registry-5f95988bbc-b6txh" (UID: "5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a") : secret "image-registry-tls" not found Apr 24 21:29:46.920858 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:46.920757 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fbtc5" event={"ID":"2e9ba745-815f-4019-a172-e88557fff65c","Type":"ContainerStarted","Data":"7fefd5ee52031d8ee8ae7117a668178fb982b5531a0a861eff3a1f56ebb6eaf0"} Apr 24 21:29:47.749755 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:47.749722 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-c2lfl"] Apr 24 21:29:47.753018 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:47.752993 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-c2lfl" Apr 24 21:29:47.756373 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:47.756340 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-t58sv\"" Apr 24 21:29:47.761751 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:47.761729 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-c2lfl"] Apr 24 21:29:47.831409 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:47.831373 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff8kd\" (UniqueName: \"kubernetes.io/projected/bdffaacb-7941-4781-a486-7ed533d52846-kube-api-access-ff8kd\") pod \"network-check-source-8894fc9bd-c2lfl\" (UID: \"bdffaacb-7941-4781-a486-7ed533d52846\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-c2lfl" Apr 24 21:29:47.932327 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:47.932290 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ff8kd\" (UniqueName: \"kubernetes.io/projected/bdffaacb-7941-4781-a486-7ed533d52846-kube-api-access-ff8kd\") pod \"network-check-source-8894fc9bd-c2lfl\" (UID: \"bdffaacb-7941-4781-a486-7ed533d52846\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-c2lfl" Apr 24 21:29:47.942589 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:47.942552 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff8kd\" (UniqueName: \"kubernetes.io/projected/bdffaacb-7941-4781-a486-7ed533d52846-kube-api-access-ff8kd\") pod \"network-check-source-8894fc9bd-c2lfl\" (UID: \"bdffaacb-7941-4781-a486-7ed533d52846\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-c2lfl" Apr 24 21:29:48.064023 ip-10-0-136-160 kubenswrapper[2575]: I0424 
21:29:48.064004 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-c2lfl" Apr 24 21:29:48.178714 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:48.178679 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-c2lfl"] Apr 24 21:29:48.181294 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:29:48.181265 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdffaacb_7941_4781_a486_7ed533d52846.slice/crio-13a609b7e02f9a65e2d1d3bd0bd6272c6d962b1d095a95be18b059e11e59aee0 WatchSource:0}: Error finding container 13a609b7e02f9a65e2d1d3bd0bd6272c6d962b1d095a95be18b059e11e59aee0: Status 404 returned error can't find the container with id 13a609b7e02f9a65e2d1d3bd0bd6272c6d962b1d095a95be18b059e11e59aee0 Apr 24 21:29:48.234604 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:48.234579 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs\") pod \"network-metrics-daemon-l78bh\" (UID: \"6ec8ce7f-d73f-4ff5-a981-9d84448a51a6\") " pod="openshift-multus/network-metrics-daemon-l78bh" Apr 24 21:29:48.234846 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:29:48.234729 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 21:29:48.234846 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:29:48.234791 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs podName:6ec8ce7f-d73f-4ff5-a981-9d84448a51a6 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:50.234772108 +0000 UTC m=+252.174617824 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs") pod "network-metrics-daemon-l78bh" (UID: "6ec8ce7f-d73f-4ff5-a981-9d84448a51a6") : secret "metrics-daemon-secret" not found Apr 24 21:29:48.926832 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:48.926796 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fbtc5" event={"ID":"2e9ba745-815f-4019-a172-e88557fff65c","Type":"ContainerStarted","Data":"2bccd1a6728fa83e818ca57c809d6f6e7055901a25dd0048346e875b525691fd"} Apr 24 21:29:48.928166 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:48.928138 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-c2lfl" event={"ID":"bdffaacb-7941-4781-a486-7ed533d52846","Type":"ContainerStarted","Data":"d92d7435f36fae87f319efe4f81c644ee21057153a491cdc6693b304601d8c97"} Apr 24 21:29:48.928166 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:48.928165 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-c2lfl" event={"ID":"bdffaacb-7941-4781-a486-7ed533d52846","Type":"ContainerStarted","Data":"13a609b7e02f9a65e2d1d3bd0bd6272c6d962b1d095a95be18b059e11e59aee0"} Apr 24 21:29:48.943331 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:48.943286 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fbtc5" podStartSLOduration=2.14404655 podStartE2EDuration="3.943272938s" podCreationTimestamp="2026-04-24 21:29:45 +0000 UTC" firstStartedPulling="2026-04-24 21:29:46.258352656 +0000 UTC m=+128.198198371" lastFinishedPulling="2026-04-24 21:29:48.05757904 +0000 UTC m=+129.997424759" observedRunningTime="2026-04-24 21:29:48.942896813 +0000 UTC m=+130.882742548" watchObservedRunningTime="2026-04-24 21:29:48.943272938 +0000 
UTC m=+130.883118671" Apr 24 21:29:48.957848 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:48.957799 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-c2lfl" podStartSLOduration=1.957788978 podStartE2EDuration="1.957788978s" podCreationTimestamp="2026-04-24 21:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:29:48.957748297 +0000 UTC m=+130.897594030" watchObservedRunningTime="2026-04-24 21:29:48.957788978 +0000 UTC m=+130.897634714" Apr 24 21:29:50.654373 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:50.654341 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-tls\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:50.654753 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:29:50.654514 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:29:50.654753 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:29:50.654532 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f95988bbc-b6txh: secret "image-registry-tls" not found Apr 24 21:29:50.654753 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:29:50.654591 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-tls podName:5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a nodeName:}" failed. No retries permitted until 2026-04-24 21:29:58.654573627 +0000 UTC m=+140.594419340 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-tls") pod "image-registry-5f95988bbc-b6txh" (UID: "5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a") : secret "image-registry-tls" not found Apr 24 21:29:51.857719 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:51.857684 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jtfx2"] Apr 24 21:29:51.860890 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:51.860872 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-jtfx2" Apr 24 21:29:51.864465 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:51.864440 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 24 21:29:51.864626 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:51.864605 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-xhp9x\"" Apr 24 21:29:51.865516 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:51.865500 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 24 21:29:51.865567 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:51.865502 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 24 21:29:51.865567 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:51.865542 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 24 21:29:51.875611 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:51.875590 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jtfx2"] Apr 24 21:29:51.962180 ip-10-0-136-160 kubenswrapper[2575]: I0424 
21:29:51.962161 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b3f7a4ce-1dd3-4768-89a3-e35106a565cf-signing-key\") pod \"service-ca-865cb79987-jtfx2\" (UID: \"b3f7a4ce-1dd3-4768-89a3-e35106a565cf\") " pod="openshift-service-ca/service-ca-865cb79987-jtfx2" Apr 24 21:29:51.962267 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:51.962191 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b3f7a4ce-1dd3-4768-89a3-e35106a565cf-signing-cabundle\") pod \"service-ca-865cb79987-jtfx2\" (UID: \"b3f7a4ce-1dd3-4768-89a3-e35106a565cf\") " pod="openshift-service-ca/service-ca-865cb79987-jtfx2" Apr 24 21:29:51.962267 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:51.962227 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zgjm\" (UniqueName: \"kubernetes.io/projected/b3f7a4ce-1dd3-4768-89a3-e35106a565cf-kube-api-access-8zgjm\") pod \"service-ca-865cb79987-jtfx2\" (UID: \"b3f7a4ce-1dd3-4768-89a3-e35106a565cf\") " pod="openshift-service-ca/service-ca-865cb79987-jtfx2" Apr 24 21:29:52.063499 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:52.063473 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zgjm\" (UniqueName: \"kubernetes.io/projected/b3f7a4ce-1dd3-4768-89a3-e35106a565cf-kube-api-access-8zgjm\") pod \"service-ca-865cb79987-jtfx2\" (UID: \"b3f7a4ce-1dd3-4768-89a3-e35106a565cf\") " pod="openshift-service-ca/service-ca-865cb79987-jtfx2" Apr 24 21:29:52.063577 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:52.063535 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b3f7a4ce-1dd3-4768-89a3-e35106a565cf-signing-key\") pod \"service-ca-865cb79987-jtfx2\" (UID: 
\"b3f7a4ce-1dd3-4768-89a3-e35106a565cf\") " pod="openshift-service-ca/service-ca-865cb79987-jtfx2" Apr 24 21:29:52.063577 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:52.063557 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b3f7a4ce-1dd3-4768-89a3-e35106a565cf-signing-cabundle\") pod \"service-ca-865cb79987-jtfx2\" (UID: \"b3f7a4ce-1dd3-4768-89a3-e35106a565cf\") " pod="openshift-service-ca/service-ca-865cb79987-jtfx2" Apr 24 21:29:52.064126 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:52.064106 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b3f7a4ce-1dd3-4768-89a3-e35106a565cf-signing-cabundle\") pod \"service-ca-865cb79987-jtfx2\" (UID: \"b3f7a4ce-1dd3-4768-89a3-e35106a565cf\") " pod="openshift-service-ca/service-ca-865cb79987-jtfx2" Apr 24 21:29:52.065791 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:52.065771 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b3f7a4ce-1dd3-4768-89a3-e35106a565cf-signing-key\") pod \"service-ca-865cb79987-jtfx2\" (UID: \"b3f7a4ce-1dd3-4768-89a3-e35106a565cf\") " pod="openshift-service-ca/service-ca-865cb79987-jtfx2" Apr 24 21:29:52.072999 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:52.072978 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zgjm\" (UniqueName: \"kubernetes.io/projected/b3f7a4ce-1dd3-4768-89a3-e35106a565cf-kube-api-access-8zgjm\") pod \"service-ca-865cb79987-jtfx2\" (UID: \"b3f7a4ce-1dd3-4768-89a3-e35106a565cf\") " pod="openshift-service-ca/service-ca-865cb79987-jtfx2" Apr 24 21:29:52.170274 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:52.170219 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-jtfx2" Apr 24 21:29:52.304053 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:52.304024 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jtfx2"] Apr 24 21:29:52.307285 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:29:52.307243 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3f7a4ce_1dd3_4768_89a3_e35106a565cf.slice/crio-8ec9b26d0afaa86212f4acf87bc9830c0e71bff387d8c182f72abf2bab5fd540 WatchSource:0}: Error finding container 8ec9b26d0afaa86212f4acf87bc9830c0e71bff387d8c182f72abf2bab5fd540: Status 404 returned error can't find the container with id 8ec9b26d0afaa86212f4acf87bc9830c0e71bff387d8c182f72abf2bab5fd540 Apr 24 21:29:52.939770 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:52.939734 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-jtfx2" event={"ID":"b3f7a4ce-1dd3-4768-89a3-e35106a565cf","Type":"ContainerStarted","Data":"a1b77fd034baf425961c8b99fd0ed4df0e1d89d8e1779120fbe32188f42c7240"} Apr 24 21:29:52.939770 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:52.939772 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-jtfx2" event={"ID":"b3f7a4ce-1dd3-4768-89a3-e35106a565cf","Type":"ContainerStarted","Data":"8ec9b26d0afaa86212f4acf87bc9830c0e71bff387d8c182f72abf2bab5fd540"} Apr 24 21:29:52.958178 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:52.958137 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-jtfx2" podStartSLOduration=1.958124705 podStartE2EDuration="1.958124705s" podCreationTimestamp="2026-04-24 21:29:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 
21:29:52.956813336 +0000 UTC m=+134.896659074" watchObservedRunningTime="2026-04-24 21:29:52.958124705 +0000 UTC m=+134.897970476" Apr 24 21:29:58.711684 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:58.711643 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-tls\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:58.713921 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:58.713896 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-tls\") pod \"image-registry-5f95988bbc-b6txh\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:58.772228 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:58.772206 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:58.891821 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:58.891729 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5f95988bbc-b6txh"] Apr 24 21:29:58.894307 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:29:58.894265 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5eb8c399_f9a0_4de2_bd75_f1bbf42ac96a.slice/crio-ddc7b0ea465d2e3e49e1f2940b967649246ee820becf932596d91dae7bb478f5 WatchSource:0}: Error finding container ddc7b0ea465d2e3e49e1f2940b967649246ee820becf932596d91dae7bb478f5: Status 404 returned error can't find the container with id ddc7b0ea465d2e3e49e1f2940b967649246ee820becf932596d91dae7bb478f5 Apr 24 21:29:58.954488 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:58.954460 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" event={"ID":"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a","Type":"ContainerStarted","Data":"3dea42a2d2395e2038a0db69da500ef05bdbba73dfcd320aa57d50cb50d67e11"} Apr 24 21:29:58.954589 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:58.954499 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" event={"ID":"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a","Type":"ContainerStarted","Data":"ddc7b0ea465d2e3e49e1f2940b967649246ee820becf932596d91dae7bb478f5"} Apr 24 21:29:58.954645 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:58.954630 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:29:58.977365 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:29:58.977298 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" 
podStartSLOduration=16.977285748 podStartE2EDuration="16.977285748s" podCreationTimestamp="2026-04-24 21:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:29:58.976214369 +0000 UTC m=+140.916060102" watchObservedRunningTime="2026-04-24 21:29:58.977285748 +0000 UTC m=+140.917131476" Apr 24 21:30:11.297936 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.297886 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5f95988bbc-b6txh"] Apr 24 21:30:11.329137 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.329112 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6d8fc7475c-md5gj"] Apr 24 21:30:11.333829 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.333813 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.350113 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.350094 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d8fc7475c-md5gj"] Apr 24 21:30:11.395385 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.395357 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fcwpj"] Apr 24 21:30:11.398397 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.398380 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fcwpj" Apr 24 21:30:11.398504 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.398491 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/50576603-0528-4100-9713-5d6578a97229-registry-certificates\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.398544 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.398524 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50576603-0528-4100-9713-5d6578a97229-trusted-ca\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.398587 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.398550 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/50576603-0528-4100-9713-5d6578a97229-installation-pull-secrets\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.398587 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.398579 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/50576603-0528-4100-9713-5d6578a97229-image-registry-private-configuration\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.398680 ip-10-0-136-160 
kubenswrapper[2575]: I0424 21:30:11.398609 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/50576603-0528-4100-9713-5d6578a97229-bound-sa-token\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.398680 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.398637 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/50576603-0528-4100-9713-5d6578a97229-ca-trust-extracted\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.398757 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.398685 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/50576603-0528-4100-9713-5d6578a97229-registry-tls\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.398797 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.398768 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp6kk\" (UniqueName: \"kubernetes.io/projected/50576603-0528-4100-9713-5d6578a97229-kube-api-access-qp6kk\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.401729 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.401711 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4xl79\"" Apr 24 21:30:11.401828 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.401812 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:30:11.402463 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.402448 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:30:11.409973 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.409952 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fcwpj"] Apr 24 21:30:11.499231 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.499208 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/50576603-0528-4100-9713-5d6578a97229-registry-certificates\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.499363 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.499238 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50576603-0528-4100-9713-5d6578a97229-trusted-ca\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.499363 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.499256 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/50576603-0528-4100-9713-5d6578a97229-installation-pull-secrets\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " 
pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.499363 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.499275 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/50576603-0528-4100-9713-5d6578a97229-image-registry-private-configuration\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.499363 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.499303 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ba08d1de-7f2a-43fa-9f8c-c670824b9bdb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fcwpj\" (UID: \"ba08d1de-7f2a-43fa-9f8c-c670824b9bdb\") " pod="openshift-insights/insights-runtime-extractor-fcwpj" Apr 24 21:30:11.499363 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.499327 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlxpt\" (UniqueName: \"kubernetes.io/projected/ba08d1de-7f2a-43fa-9f8c-c670824b9bdb-kube-api-access-wlxpt\") pod \"insights-runtime-extractor-fcwpj\" (UID: \"ba08d1de-7f2a-43fa-9f8c-c670824b9bdb\") " pod="openshift-insights/insights-runtime-extractor-fcwpj" Apr 24 21:30:11.499363 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.499355 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ba08d1de-7f2a-43fa-9f8c-c670824b9bdb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fcwpj\" (UID: \"ba08d1de-7f2a-43fa-9f8c-c670824b9bdb\") " pod="openshift-insights/insights-runtime-extractor-fcwpj" Apr 24 21:30:11.499651 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.499395 
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/50576603-0528-4100-9713-5d6578a97229-bound-sa-token\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.499651 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.499532 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ba08d1de-7f2a-43fa-9f8c-c670824b9bdb-data-volume\") pod \"insights-runtime-extractor-fcwpj\" (UID: \"ba08d1de-7f2a-43fa-9f8c-c670824b9bdb\") " pod="openshift-insights/insights-runtime-extractor-fcwpj" Apr 24 21:30:11.499651 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.499576 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/50576603-0528-4100-9713-5d6578a97229-ca-trust-extracted\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.499651 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.499630 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/50576603-0528-4100-9713-5d6578a97229-registry-tls\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.499841 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.499790 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qp6kk\" (UniqueName: \"kubernetes.io/projected/50576603-0528-4100-9713-5d6578a97229-kube-api-access-qp6kk\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: 
\"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.499841 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.499837 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ba08d1de-7f2a-43fa-9f8c-c670824b9bdb-crio-socket\") pod \"insights-runtime-extractor-fcwpj\" (UID: \"ba08d1de-7f2a-43fa-9f8c-c670824b9bdb\") " pod="openshift-insights/insights-runtime-extractor-fcwpj" Apr 24 21:30:11.499940 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.499904 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/50576603-0528-4100-9713-5d6578a97229-ca-trust-extracted\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.500027 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.500008 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/50576603-0528-4100-9713-5d6578a97229-registry-certificates\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.500343 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.500321 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50576603-0528-4100-9713-5d6578a97229-trusted-ca\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.501702 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.501678 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/50576603-0528-4100-9713-5d6578a97229-installation-pull-secrets\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.501857 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.501838 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/50576603-0528-4100-9713-5d6578a97229-image-registry-private-configuration\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.502046 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.502028 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/50576603-0528-4100-9713-5d6578a97229-registry-tls\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.508711 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.508693 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/50576603-0528-4100-9713-5d6578a97229-bound-sa-token\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.512407 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.512389 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp6kk\" (UniqueName: \"kubernetes.io/projected/50576603-0528-4100-9713-5d6578a97229-kube-api-access-qp6kk\") pod \"image-registry-6d8fc7475c-md5gj\" (UID: \"50576603-0528-4100-9713-5d6578a97229\") " 
pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.600197 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.600133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ba08d1de-7f2a-43fa-9f8c-c670824b9bdb-crio-socket\") pod \"insights-runtime-extractor-fcwpj\" (UID: \"ba08d1de-7f2a-43fa-9f8c-c670824b9bdb\") " pod="openshift-insights/insights-runtime-extractor-fcwpj" Apr 24 21:30:11.600287 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.600238 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ba08d1de-7f2a-43fa-9f8c-c670824b9bdb-crio-socket\") pod \"insights-runtime-extractor-fcwpj\" (UID: \"ba08d1de-7f2a-43fa-9f8c-c670824b9bdb\") " pod="openshift-insights/insights-runtime-extractor-fcwpj" Apr 24 21:30:11.600287 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.600241 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ba08d1de-7f2a-43fa-9f8c-c670824b9bdb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fcwpj\" (UID: \"ba08d1de-7f2a-43fa-9f8c-c670824b9bdb\") " pod="openshift-insights/insights-runtime-extractor-fcwpj" Apr 24 21:30:11.600287 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.600274 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlxpt\" (UniqueName: \"kubernetes.io/projected/ba08d1de-7f2a-43fa-9f8c-c670824b9bdb-kube-api-access-wlxpt\") pod \"insights-runtime-extractor-fcwpj\" (UID: \"ba08d1de-7f2a-43fa-9f8c-c670824b9bdb\") " pod="openshift-insights/insights-runtime-extractor-fcwpj" Apr 24 21:30:11.600454 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.600293 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/ba08d1de-7f2a-43fa-9f8c-c670824b9bdb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fcwpj\" (UID: \"ba08d1de-7f2a-43fa-9f8c-c670824b9bdb\") " pod="openshift-insights/insights-runtime-extractor-fcwpj" Apr 24 21:30:11.600454 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.600312 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ba08d1de-7f2a-43fa-9f8c-c670824b9bdb-data-volume\") pod \"insights-runtime-extractor-fcwpj\" (UID: \"ba08d1de-7f2a-43fa-9f8c-c670824b9bdb\") " pod="openshift-insights/insights-runtime-extractor-fcwpj" Apr 24 21:30:11.600588 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.600573 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ba08d1de-7f2a-43fa-9f8c-c670824b9bdb-data-volume\") pod \"insights-runtime-extractor-fcwpj\" (UID: \"ba08d1de-7f2a-43fa-9f8c-c670824b9bdb\") " pod="openshift-insights/insights-runtime-extractor-fcwpj" Apr 24 21:30:11.600813 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.600794 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ba08d1de-7f2a-43fa-9f8c-c670824b9bdb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fcwpj\" (UID: \"ba08d1de-7f2a-43fa-9f8c-c670824b9bdb\") " pod="openshift-insights/insights-runtime-extractor-fcwpj" Apr 24 21:30:11.602310 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.602294 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ba08d1de-7f2a-43fa-9f8c-c670824b9bdb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fcwpj\" (UID: \"ba08d1de-7f2a-43fa-9f8c-c670824b9bdb\") " pod="openshift-insights/insights-runtime-extractor-fcwpj" Apr 24 21:30:11.614980 ip-10-0-136-160 kubenswrapper[2575]: 
I0424 21:30:11.614954 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlxpt\" (UniqueName: \"kubernetes.io/projected/ba08d1de-7f2a-43fa-9f8c-c670824b9bdb-kube-api-access-wlxpt\") pod \"insights-runtime-extractor-fcwpj\" (UID: \"ba08d1de-7f2a-43fa-9f8c-c670824b9bdb\") " pod="openshift-insights/insights-runtime-extractor-fcwpj" Apr 24 21:30:11.641733 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.641711 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:11.707374 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.707347 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fcwpj" Apr 24 21:30:11.774770 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.774694 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d8fc7475c-md5gj"] Apr 24 21:30:11.780476 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:30:11.780412 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50576603_0528_4100_9713_5d6578a97229.slice/crio-dc84f44d9eeeac16b368a879a8382da4bed72f4800886412520217e71304b9a7 WatchSource:0}: Error finding container dc84f44d9eeeac16b368a879a8382da4bed72f4800886412520217e71304b9a7: Status 404 returned error can't find the container with id dc84f44d9eeeac16b368a879a8382da4bed72f4800886412520217e71304b9a7 Apr 24 21:30:11.844462 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.844440 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fcwpj"] Apr 24 21:30:11.847148 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:30:11.847119 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba08d1de_7f2a_43fa_9f8c_c670824b9bdb.slice/crio-efbf349c605ca33ff257656936247c7eef452851d40dcebe141114bb96baba9d WatchSource:0}: Error finding container efbf349c605ca33ff257656936247c7eef452851d40dcebe141114bb96baba9d: Status 404 returned error can't find the container with id efbf349c605ca33ff257656936247c7eef452851d40dcebe141114bb96baba9d Apr 24 21:30:11.986835 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.986807 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fcwpj" event={"ID":"ba08d1de-7f2a-43fa-9f8c-c670824b9bdb","Type":"ContainerStarted","Data":"bb67344a3906012400a3fa0044f08f20dfe9f3791c04c102d76d229d8d2ad0a8"} Apr 24 21:30:11.987008 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.986842 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fcwpj" event={"ID":"ba08d1de-7f2a-43fa-9f8c-c670824b9bdb","Type":"ContainerStarted","Data":"efbf349c605ca33ff257656936247c7eef452851d40dcebe141114bb96baba9d"} Apr 24 21:30:11.988037 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.988012 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" event={"ID":"50576603-0528-4100-9713-5d6578a97229","Type":"ContainerStarted","Data":"0b0cb9b2bce6c1da9e5c43687d797f6e3e952f130c959378795c47c6f42bd95c"} Apr 24 21:30:11.988129 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.988043 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" event={"ID":"50576603-0528-4100-9713-5d6578a97229","Type":"ContainerStarted","Data":"dc84f44d9eeeac16b368a879a8382da4bed72f4800886412520217e71304b9a7"} Apr 24 21:30:11.988194 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:11.988177 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:12.012272 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:12.012232 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" podStartSLOduration=1.012221382 podStartE2EDuration="1.012221382s" podCreationTimestamp="2026-04-24 21:30:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:30:12.007965156 +0000 UTC m=+153.947810890" watchObservedRunningTime="2026-04-24 21:30:12.012221382 +0000 UTC m=+153.952067122" Apr 24 21:30:12.992677 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:12.992643 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fcwpj" event={"ID":"ba08d1de-7f2a-43fa-9f8c-c670824b9bdb","Type":"ContainerStarted","Data":"b5d9386fe57f57967fb48ddf4d8cd13e5a32074adfa0cd5ce4026fbeb3737cc0"} Apr 24 21:30:13.514173 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:30:13.514129 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-8m598" podUID="4c8553a4-97bd-43aa-a9ab-7ccbb4358a98" Apr 24 21:30:13.528323 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:30:13.528286 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-pzzlw" podUID="51fc9513-bf57-4b5f-9a7c-f7325f046b26" Apr 24 21:30:13.588851 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:30:13.588815 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" 
pod="openshift-multus/network-metrics-daemon-l78bh" podUID="6ec8ce7f-d73f-4ff5-a981-9d84448a51a6" Apr 24 21:30:13.996277 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:13.996246 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fcwpj" event={"ID":"ba08d1de-7f2a-43fa-9f8c-c670824b9bdb","Type":"ContainerStarted","Data":"d9c34710f3e47b2a97dca6f28f08b1533d8cdd553010773bbc376afc6645c19b"} Apr 24 21:30:13.996637 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:13.996280 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8m598" Apr 24 21:30:14.015110 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:14.015072 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fcwpj" podStartSLOduration=1.091188772 podStartE2EDuration="3.015063231s" podCreationTimestamp="2026-04-24 21:30:11 +0000 UTC" firstStartedPulling="2026-04-24 21:30:11.894921752 +0000 UTC m=+153.834767467" lastFinishedPulling="2026-04-24 21:30:13.818796205 +0000 UTC m=+155.758641926" observedRunningTime="2026-04-24 21:30:14.013856174 +0000 UTC m=+155.953701907" watchObservedRunningTime="2026-04-24 21:30:14.015063231 +0000 UTC m=+155.954908963" Apr 24 21:30:18.452964 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.452931 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls\") pod \"dns-default-8m598\" (UID: \"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98\") " pod="openshift-dns/dns-default-8m598" Apr 24 21:30:18.455192 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.455170 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c8553a4-97bd-43aa-a9ab-7ccbb4358a98-metrics-tls\") pod \"dns-default-8m598\" (UID: 
\"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98\") " pod="openshift-dns/dns-default-8m598" Apr 24 21:30:18.499907 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.499883 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5mvz7\"" Apr 24 21:30:18.508121 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.508107 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8m598" Apr 24 21:30:18.553641 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.553608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert\") pod \"ingress-canary-pzzlw\" (UID: \"51fc9513-bf57-4b5f-9a7c-f7325f046b26\") " pod="openshift-ingress-canary/ingress-canary-pzzlw" Apr 24 21:30:18.557390 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.557350 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51fc9513-bf57-4b5f-9a7c-f7325f046b26-cert\") pod \"ingress-canary-pzzlw\" (UID: \"51fc9513-bf57-4b5f-9a7c-f7325f046b26\") " pod="openshift-ingress-canary/ingress-canary-pzzlw" Apr 24 21:30:18.632252 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.632227 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8m598"] Apr 24 21:30:18.635864 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:30:18.635832 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c8553a4_97bd_43aa_a9ab_7ccbb4358a98.slice/crio-e5c19991aa7cb8be7fd7fa395943994d017014503d69661b7621a4157b22c706 WatchSource:0}: Error finding container e5c19991aa7cb8be7fd7fa395943994d017014503d69661b7621a4157b22c706: Status 404 returned error can't find the container with id e5c19991aa7cb8be7fd7fa395943994d017014503d69661b7621a4157b22c706 Apr 24 21:30:18.725054 
ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.724991 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf"] Apr 24 21:30:18.730279 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.730261 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf" Apr 24 21:30:18.733089 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.732961 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 21:30:18.733089 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.732962 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:30:18.733394 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.733375 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:30:18.733755 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.733693 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 21:30:18.733952 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.733937 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-zsz7w\"" Apr 24 21:30:18.734316 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.734296 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:30:18.745555 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.745505 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf"] Apr 24 21:30:18.754489 ip-10-0-136-160 kubenswrapper[2575]: I0424 
21:30:18.754071 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-rg89r"] Apr 24 21:30:18.757930 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.757910 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-pb4jj"] Apr 24 21:30:18.758081 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.758065 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.761270 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.760973 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 21:30:18.761270 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.760974 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 21:30:18.761270 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.761175 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 21:30:18.761270 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.761243 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-9mvpt\"" Apr 24 21:30:18.761545 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.761357 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" Apr 24 21:30:18.768662 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.766137 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 24 21:30:18.768662 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.766226 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 24 21:30:18.768662 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.767782 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-lt8d4\"" Apr 24 21:30:18.768662 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.767949 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 24 21:30:18.774461 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.774042 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-pb4jj"] Apr 24 21:30:18.856380 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.856352 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-root\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.856495 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.856446 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-sys\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " 
pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.856495 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.856473 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzsnt\" (UniqueName: \"kubernetes.io/projected/253e4ec4-590d-47fb-8e5f-d260cbf867f8-kube-api-access-bzsnt\") pod \"openshift-state-metrics-9d44df66c-lx2hf\" (UID: \"253e4ec4-590d-47fb-8e5f-d260cbf867f8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf" Apr 24 21:30:18.856587 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.856497 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/713b6f30-0ef6-4532-af69-cc0928983c5b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-pb4jj\" (UID: \"713b6f30-0ef6-4532-af69-cc0928983c5b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" Apr 24 21:30:18.856587 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.856560 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/253e4ec4-590d-47fb-8e5f-d260cbf867f8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-lx2hf\" (UID: \"253e4ec4-590d-47fb-8e5f-d260cbf867f8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf" Apr 24 21:30:18.856655 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.856602 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-node-exporter-textfile\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.856655 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.856632 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h24bd\" (UniqueName: \"kubernetes.io/projected/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-kube-api-access-h24bd\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.856733 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.856656 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/253e4ec4-590d-47fb-8e5f-d260cbf867f8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-lx2hf\" (UID: \"253e4ec4-590d-47fb-8e5f-d260cbf867f8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf" Apr 24 21:30:18.856733 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.856711 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/713b6f30-0ef6-4532-af69-cc0928983c5b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pb4jj\" (UID: \"713b6f30-0ef6-4532-af69-cc0928983c5b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" Apr 24 21:30:18.856806 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.856745 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbjbw\" (UniqueName: \"kubernetes.io/projected/713b6f30-0ef6-4532-af69-cc0928983c5b-kube-api-access-xbjbw\") pod \"kube-state-metrics-69db897b98-pb4jj\" (UID: \"713b6f30-0ef6-4532-af69-cc0928983c5b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" Apr 24 21:30:18.856852 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.856808 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/713b6f30-0ef6-4532-af69-cc0928983c5b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-pb4jj\" (UID: \"713b6f30-0ef6-4532-af69-cc0928983c5b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" Apr 24 21:30:18.856852 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.856843 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.856931 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.856874 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/713b6f30-0ef6-4532-af69-cc0928983c5b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-pb4jj\" (UID: \"713b6f30-0ef6-4532-af69-cc0928983c5b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" Apr 24 21:30:18.856931 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.856904 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-node-exporter-tls\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.857028 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.856960 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-node-exporter-wtmp\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.857028 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.856998 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/713b6f30-0ef6-4532-af69-cc0928983c5b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-pb4jj\" (UID: \"713b6f30-0ef6-4532-af69-cc0928983c5b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" Apr 24 21:30:18.857128 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.857030 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-node-exporter-accelerators-collector-config\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.857128 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.857078 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/253e4ec4-590d-47fb-8e5f-d260cbf867f8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lx2hf\" (UID: \"253e4ec4-590d-47fb-8e5f-d260cbf867f8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf" Apr 24 21:30:18.857128 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.857104 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-metrics-client-ca\") pod \"node-exporter-rg89r\" (UID: 
\"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.963205 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.963170 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-sys\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.963359 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.963210 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzsnt\" (UniqueName: \"kubernetes.io/projected/253e4ec4-590d-47fb-8e5f-d260cbf867f8-kube-api-access-bzsnt\") pod \"openshift-state-metrics-9d44df66c-lx2hf\" (UID: \"253e4ec4-590d-47fb-8e5f-d260cbf867f8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf" Apr 24 21:30:18.963359 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.963237 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/713b6f30-0ef6-4532-af69-cc0928983c5b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-pb4jj\" (UID: \"713b6f30-0ef6-4532-af69-cc0928983c5b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" Apr 24 21:30:18.963359 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.963265 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/253e4ec4-590d-47fb-8e5f-d260cbf867f8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-lx2hf\" (UID: \"253e4ec4-590d-47fb-8e5f-d260cbf867f8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf" Apr 24 21:30:18.963359 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.963295 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-node-exporter-textfile\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.963359 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.963326 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h24bd\" (UniqueName: \"kubernetes.io/projected/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-kube-api-access-h24bd\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.963359 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.963348 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/253e4ec4-590d-47fb-8e5f-d260cbf867f8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-lx2hf\" (UID: \"253e4ec4-590d-47fb-8e5f-d260cbf867f8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf" Apr 24 21:30:18.963660 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.963375 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/713b6f30-0ef6-4532-af69-cc0928983c5b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pb4jj\" (UID: \"713b6f30-0ef6-4532-af69-cc0928983c5b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" Apr 24 21:30:18.963660 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.963399 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbjbw\" (UniqueName: \"kubernetes.io/projected/713b6f30-0ef6-4532-af69-cc0928983c5b-kube-api-access-xbjbw\") pod \"kube-state-metrics-69db897b98-pb4jj\" (UID: \"713b6f30-0ef6-4532-af69-cc0928983c5b\") 
" pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" Apr 24 21:30:18.963660 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:30:18.963604 2575 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 24 21:30:18.963794 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:30:18.963662 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/713b6f30-0ef6-4532-af69-cc0928983c5b-kube-state-metrics-tls podName:713b6f30-0ef6-4532-af69-cc0928983c5b nodeName:}" failed. No retries permitted until 2026-04-24 21:30:19.46364386 +0000 UTC m=+161.403489574 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/713b6f30-0ef6-4532-af69-cc0928983c5b-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-pb4jj" (UID: "713b6f30-0ef6-4532-af69-cc0928983c5b") : secret "kube-state-metrics-tls" not found Apr 24 21:30:18.963794 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.963371 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-sys\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.963977 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.963949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/713b6f30-0ef6-4532-af69-cc0928983c5b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-pb4jj\" (UID: \"713b6f30-0ef6-4532-af69-cc0928983c5b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" Apr 24 21:30:18.964113 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.963985 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-node-exporter-textfile\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.964113 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.963994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.964113 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.964066 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/713b6f30-0ef6-4532-af69-cc0928983c5b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-pb4jj\" (UID: \"713b6f30-0ef6-4532-af69-cc0928983c5b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" Apr 24 21:30:18.964273 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.964200 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/713b6f30-0ef6-4532-af69-cc0928983c5b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-pb4jj\" (UID: \"713b6f30-0ef6-4532-af69-cc0928983c5b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" Apr 24 21:30:18.964273 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.964235 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-node-exporter-tls\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " 
pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.964369 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.964271 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-node-exporter-wtmp\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.964369 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.964297 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/713b6f30-0ef6-4532-af69-cc0928983c5b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-pb4jj\" (UID: \"713b6f30-0ef6-4532-af69-cc0928983c5b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" Apr 24 21:30:18.964369 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.964325 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-node-exporter-accelerators-collector-config\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.964369 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.964355 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/253e4ec4-590d-47fb-8e5f-d260cbf867f8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lx2hf\" (UID: \"253e4ec4-590d-47fb-8e5f-d260cbf867f8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf" Apr 24 21:30:18.964576 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.964379 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-metrics-client-ca\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.964576 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.964412 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-root\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.964576 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.964494 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/253e4ec4-590d-47fb-8e5f-d260cbf867f8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-lx2hf\" (UID: \"253e4ec4-590d-47fb-8e5f-d260cbf867f8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf" Apr 24 21:30:18.964576 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.964503 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-root\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.964808 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.964786 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/713b6f30-0ef6-4532-af69-cc0928983c5b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-pb4jj\" (UID: \"713b6f30-0ef6-4532-af69-cc0928983c5b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" Apr 24 21:30:18.965039 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.965022 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-node-exporter-wtmp\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.965478 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.965454 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/713b6f30-0ef6-4532-af69-cc0928983c5b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-pb4jj\" (UID: \"713b6f30-0ef6-4532-af69-cc0928983c5b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" Apr 24 21:30:18.965858 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.965833 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-metrics-client-ca\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.966213 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.966194 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-node-exporter-accelerators-collector-config\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.967016 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.966994 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rg89r\" (UID: 
\"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.967106 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.967006 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-node-exporter-tls\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:18.967541 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.967513 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/713b6f30-0ef6-4532-af69-cc0928983c5b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-pb4jj\" (UID: \"713b6f30-0ef6-4532-af69-cc0928983c5b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" Apr 24 21:30:18.967715 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.967694 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/253e4ec4-590d-47fb-8e5f-d260cbf867f8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-lx2hf\" (UID: \"253e4ec4-590d-47fb-8e5f-d260cbf867f8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf" Apr 24 21:30:18.967803 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.967782 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/253e4ec4-590d-47fb-8e5f-d260cbf867f8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lx2hf\" (UID: \"253e4ec4-590d-47fb-8e5f-d260cbf867f8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf" Apr 24 21:30:18.971854 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.971833 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzsnt\" (UniqueName: \"kubernetes.io/projected/253e4ec4-590d-47fb-8e5f-d260cbf867f8-kube-api-access-bzsnt\") pod \"openshift-state-metrics-9d44df66c-lx2hf\" (UID: \"253e4ec4-590d-47fb-8e5f-d260cbf867f8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf" Apr 24 21:30:18.971963 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.971939 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbjbw\" (UniqueName: \"kubernetes.io/projected/713b6f30-0ef6-4532-af69-cc0928983c5b-kube-api-access-xbjbw\") pod \"kube-state-metrics-69db897b98-pb4jj\" (UID: \"713b6f30-0ef6-4532-af69-cc0928983c5b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" Apr 24 21:30:18.972731 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:18.972713 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h24bd\" (UniqueName: \"kubernetes.io/projected/b62e611a-7e82-44ee-b32b-a1c65c0e67f3-kube-api-access-h24bd\") pod \"node-exporter-rg89r\" (UID: \"b62e611a-7e82-44ee-b32b-a1c65c0e67f3\") " pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:19.012464 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.012382 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8m598" event={"ID":"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98","Type":"ContainerStarted","Data":"e5c19991aa7cb8be7fd7fa395943994d017014503d69661b7621a4157b22c706"} Apr 24 21:30:19.041637 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.041608 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf" Apr 24 21:30:19.075464 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.075441 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-rg89r" Apr 24 21:30:19.088180 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:30:19.088160 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb62e611a_7e82_44ee_b32b_a1c65c0e67f3.slice/crio-0da32cac4219703713e34580766c6a93fba8bb198d6bd33808141b268562a471 WatchSource:0}: Error finding container 0da32cac4219703713e34580766c6a93fba8bb198d6bd33808141b268562a471: Status 404 returned error can't find the container with id 0da32cac4219703713e34580766c6a93fba8bb198d6bd33808141b268562a471 Apr 24 21:30:19.177698 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.177668 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf"] Apr 24 21:30:19.180725 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:30:19.180697 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod253e4ec4_590d_47fb_8e5f_d260cbf867f8.slice/crio-d0842d80d49c447ed67e8fa6675fa082f3f8640eac67a94733fac737402729f8 WatchSource:0}: Error finding container d0842d80d49c447ed67e8fa6675fa082f3f8640eac67a94733fac737402729f8: Status 404 returned error can't find the container with id d0842d80d49c447ed67e8fa6675fa082f3f8640eac67a94733fac737402729f8 Apr 24 21:30:19.468710 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.468683 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/713b6f30-0ef6-4532-af69-cc0928983c5b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pb4jj\" (UID: \"713b6f30-0ef6-4532-af69-cc0928983c5b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" Apr 24 21:30:19.470961 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.470942 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/713b6f30-0ef6-4532-af69-cc0928983c5b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pb4jj\" (UID: \"713b6f30-0ef6-4532-af69-cc0928983c5b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" Apr 24 21:30:19.675887 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.675813 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" Apr 24 21:30:19.747882 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.747857 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:30:19.751497 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.751469 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:30:19.754658 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.754615 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 21:30:19.754854 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.754837 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 21:30:19.754961 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.754882 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 21:30:19.755026 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.754982 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-z79q4\"" Apr 24 21:30:19.755072 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.755053 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 
24 21:30:19.755154 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.755129 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 24 21:30:19.755272 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.755222 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 24 21:30:19.755272 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.755222 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 24 21:30:19.755272 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.755247 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 24 21:30:19.755412 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.755350 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 24 21:30:19.765256 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.765238 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:30:19.871406 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.871376 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.871584 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.871411 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.871584 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.871463 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-config-volume\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.871584 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.871491 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/adbf7f0b-8326-4a75-8f71-049bc66e101f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.871584 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.871506 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.871584 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.871519 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adbf7f0b-8326-4a75-8f71-049bc66e101f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.871584 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.871542 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.871584 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.871574 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/adbf7f0b-8326-4a75-8f71-049bc66e101f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.871933 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.871598 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-web-config\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.871933 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.871619 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/adbf7f0b-8326-4a75-8f71-049bc66e101f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.871933 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.871645 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/adbf7f0b-8326-4a75-8f71-049bc66e101f-config-out\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.871933 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.871660 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.871933 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.871676 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzl2t\" (UniqueName: \"kubernetes.io/projected/adbf7f0b-8326-4a75-8f71-049bc66e101f-kube-api-access-qzl2t\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.972787 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.972749 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-config-volume\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.972933 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.972820 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/adbf7f0b-8326-4a75-8f71-049bc66e101f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.972933 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.972847 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.972933 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.972872 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adbf7f0b-8326-4a75-8f71-049bc66e101f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.972933 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.972903 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.972933 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.972921 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/adbf7f0b-8326-4a75-8f71-049bc66e101f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.973189 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.972936 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-web-config\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.973189 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.972958 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/adbf7f0b-8326-4a75-8f71-049bc66e101f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.973189 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.972997 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/adbf7f0b-8326-4a75-8f71-049bc66e101f-config-out\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.973189 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.973022 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.973189 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.973045 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzl2t\" (UniqueName: \"kubernetes.io/projected/adbf7f0b-8326-4a75-8f71-049bc66e101f-kube-api-access-qzl2t\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.973189 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.973156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.973524 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.973191 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.974109 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.973767 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/adbf7f0b-8326-4a75-8f71-049bc66e101f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.974109 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.973917 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/adbf7f0b-8326-4a75-8f71-049bc66e101f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.974701 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.974368 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adbf7f0b-8326-4a75-8f71-049bc66e101f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.976065 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.976025 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/adbf7f0b-8326-4a75-8f71-049bc66e101f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.976173 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.976074 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-web-config\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.976536 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.976513 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.976684 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.976660 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.976779 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.976661 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-config-volume\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.976779 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.976700 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.977872 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.977816 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.978510 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.978479 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.978651 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.978629 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/adbf7f0b-8326-4a75-8f71-049bc66e101f-config-out\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:19.981763 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:19.981740 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzl2t\" (UniqueName: \"kubernetes.io/projected/adbf7f0b-8326-4a75-8f71-049bc66e101f-kube-api-access-qzl2t\") pod \"alertmanager-main-0\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:20.016957 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:20.016929 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rg89r" event={"ID":"b62e611a-7e82-44ee-b32b-a1c65c0e67f3","Type":"ContainerStarted","Data":"0da32cac4219703713e34580766c6a93fba8bb198d6bd33808141b268562a471"}
Apr 24 21:30:20.018768 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:20.018740 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf" event={"ID":"253e4ec4-590d-47fb-8e5f-d260cbf867f8","Type":"ContainerStarted","Data":"26c6419c79b3b99aaacad35f821dace6f602d2725815b67ef20ed0f77c1ba6d2"}
Apr 24 21:30:20.018861 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:20.018777 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf" event={"ID":"253e4ec4-590d-47fb-8e5f-d260cbf867f8","Type":"ContainerStarted","Data":"736cbb8d7f5b04e6c1405099e030405707a9d3b1f52d3fca15ca34b07dfa5706"}
Apr 24 21:30:20.018861 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:20.018791 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf" event={"ID":"253e4ec4-590d-47fb-8e5f-d260cbf867f8","Type":"ContainerStarted","Data":"d0842d80d49c447ed67e8fa6675fa082f3f8640eac67a94733fac737402729f8"}
Apr 24 21:30:20.062516 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:20.062496 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:30:20.204904 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:20.204474 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-pb4jj"]
Apr 24 21:30:20.208842 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:30:20.208813 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod713b6f30_0ef6_4532_af69_cc0928983c5b.slice/crio-98cb813f3650fd07a525ce3b231fe8242abde6c17e2ce4dd69c379d95b111aa1 WatchSource:0}: Error finding container 98cb813f3650fd07a525ce3b231fe8242abde6c17e2ce4dd69c379d95b111aa1: Status 404 returned error can't find the container with id 98cb813f3650fd07a525ce3b231fe8242abde6c17e2ce4dd69c379d95b111aa1
Apr 24 21:30:20.233861 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:20.233576 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:30:20.237507 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:30:20.237479 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadbf7f0b_8326_4a75_8f71_049bc66e101f.slice/crio-85036e007e32852b7890c5136a3c6e0ae5ea0dd3f3b87b177787a3828b6c3831 WatchSource:0}: Error finding container 85036e007e32852b7890c5136a3c6e0ae5ea0dd3f3b87b177787a3828b6c3831: Status 404 returned error can't find the container with id 85036e007e32852b7890c5136a3c6e0ae5ea0dd3f3b87b177787a3828b6c3831
Apr 24 21:30:21.024629 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:21.024584 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8m598" event={"ID":"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98","Type":"ContainerStarted","Data":"ba684735c4a23b75bacd805b03c055483cde032105ee173ee18fa11147974c4c"}
Apr 24 21:30:21.024629 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:21.024628 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8m598" event={"ID":"4c8553a4-97bd-43aa-a9ab-7ccbb4358a98","Type":"ContainerStarted","Data":"a5f98b1ae049081c11329d46ab72a86b13c3fd0a40c757c0c8aaac55dc0a5c27"}
Apr 24 21:30:21.025151 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:21.024765 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-8m598"
Apr 24 21:30:21.026389 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:21.026304 2575 generic.go:358] "Generic (PLEG): container finished" podID="b62e611a-7e82-44ee-b32b-a1c65c0e67f3" containerID="8225b1c52ba26424469e3af2c7c0b3eb586fc2a954be295419d2893953386b02" exitCode=0
Apr 24 21:30:21.026523 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:21.026389 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rg89r" event={"ID":"b62e611a-7e82-44ee-b32b-a1c65c0e67f3","Type":"ContainerDied","Data":"8225b1c52ba26424469e3af2c7c0b3eb586fc2a954be295419d2893953386b02"}
Apr 24 21:30:21.028391 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:21.028350 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf" event={"ID":"253e4ec4-590d-47fb-8e5f-d260cbf867f8","Type":"ContainerStarted","Data":"214d66225d0aec5089b7e45c0ed4b3ef0fd2b3d17462ba4821af939bef460a6f"}
Apr 24 21:30:21.029742 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:21.029694 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adbf7f0b-8326-4a75-8f71-049bc66e101f","Type":"ContainerStarted","Data":"85036e007e32852b7890c5136a3c6e0ae5ea0dd3f3b87b177787a3828b6c3831"}
Apr 24 21:30:21.030919 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:21.030895 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" event={"ID":"713b6f30-0ef6-4532-af69-cc0928983c5b","Type":"ContainerStarted","Data":"98cb813f3650fd07a525ce3b231fe8242abde6c17e2ce4dd69c379d95b111aa1"}
Apr 24 21:30:21.047151 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:21.047112 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8m598" podStartSLOduration=129.630879427 podStartE2EDuration="2m11.047101642s" podCreationTimestamp="2026-04-24 21:28:10 +0000 UTC" firstStartedPulling="2026-04-24 21:30:18.638069616 +0000 UTC m=+160.577915341" lastFinishedPulling="2026-04-24 21:30:20.05429183 +0000 UTC m=+161.994137556" observedRunningTime="2026-04-24 21:30:21.04573957 +0000 UTC m=+162.985585303" watchObservedRunningTime="2026-04-24 21:30:21.047101642 +0000 UTC m=+162.986947380"
Apr 24 21:30:21.067095 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:21.067052 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lx2hf" podStartSLOduration=1.821151124 podStartE2EDuration="3.06703794s" podCreationTimestamp="2026-04-24 21:30:18 +0000 UTC" firstStartedPulling="2026-04-24 21:30:19.310287659 +0000 UTC m=+161.250133374" lastFinishedPulling="2026-04-24 21:30:20.556174476 +0000 UTC m=+162.496020190" observedRunningTime="2026-04-24 21:30:21.065722971 +0000 UTC m=+163.005568730" watchObservedRunningTime="2026-04-24 21:30:21.06703794 +0000 UTC m=+163.006883737"
Apr 24 21:30:21.305404 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:21.305322 2575 patch_prober.go:28] interesting pod/image-registry-5f95988bbc-b6txh container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 24 21:30:21.305575 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:21.305390 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" podUID="5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:30:22.035843 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.035812 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rg89r" event={"ID":"b62e611a-7e82-44ee-b32b-a1c65c0e67f3","Type":"ContainerStarted","Data":"ef0d3ac7619115654c11773de31da3ee015a2704e435ec2803759c8679d6b54f"}
Apr 24 21:30:22.036277 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.035851 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rg89r" event={"ID":"b62e611a-7e82-44ee-b32b-a1c65c0e67f3","Type":"ContainerStarted","Data":"6d6fc3a70fe08abd2dbba8f1c5904f84dc2bfc42d04e0b4cd402658b0a244335"}
Apr 24 21:30:22.037206 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.037179 2575 generic.go:358] "Generic (PLEG): container finished" podID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerID="f4e236fe3d745a1c20b1526810651ddcbd2e2de70d9943ae8246ecb6631c8fc6" exitCode=0
Apr 24 21:30:22.037298 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.037267 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adbf7f0b-8326-4a75-8f71-049bc66e101f","Type":"ContainerDied","Data":"f4e236fe3d745a1c20b1526810651ddcbd2e2de70d9943ae8246ecb6631c8fc6"}
Apr 24 21:30:22.039193 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.039168 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" event={"ID":"713b6f30-0ef6-4532-af69-cc0928983c5b","Type":"ContainerStarted","Data":"f923bc568449c1156b2f80bb1a0a62535adb7fbf1ec8c026628b9d1e590767a6"}
Apr 24 21:30:22.039284 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.039202 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" event={"ID":"713b6f30-0ef6-4532-af69-cc0928983c5b","Type":"ContainerStarted","Data":"455d83c801fbeb42b5c3ed7fefd21c47d54900964d1cc223637088696227745f"}
Apr 24 21:30:22.039284 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.039220 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" event={"ID":"713b6f30-0ef6-4532-af69-cc0928983c5b","Type":"ContainerStarted","Data":"6e922e607a14aa000471fe2566965e9bb4334e4725d43baf3fe1e1f57b89f222"}
Apr 24 21:30:22.058807 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.058694 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-rg89r" podStartSLOduration=3.096175475 podStartE2EDuration="4.058681107s" podCreationTimestamp="2026-04-24 21:30:18 +0000 UTC" firstStartedPulling="2026-04-24 21:30:19.092528056 +0000 UTC m=+161.032373767" lastFinishedPulling="2026-04-24 21:30:20.055033678 +0000 UTC m=+161.994879399" observedRunningTime="2026-04-24 21:30:22.058185408 +0000 UTC m=+163.998031140" watchObservedRunningTime="2026-04-24 21:30:22.058681107 +0000 UTC m=+163.998526841"
Apr 24 21:30:22.105822 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.105730 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-pb4jj" podStartSLOduration=2.778978291 podStartE2EDuration="4.105716743s" podCreationTimestamp="2026-04-24 21:30:18 +0000 UTC" firstStartedPulling="2026-04-24 21:30:20.210871311 +0000 UTC m=+162.150717022" lastFinishedPulling="2026-04-24 21:30:21.537609746 +0000 UTC m=+163.477455474" observedRunningTime="2026-04-24 21:30:22.10522874 +0000 UTC m=+164.045074477" watchObservedRunningTime="2026-04-24 21:30:22.105716743 +0000 UTC m=+164.045562476"
Apr 24 21:30:22.850072 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.850003 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-54bdb799b-9czmz"]
Apr 24 21:30:22.853537 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.853510 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:22.856182 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.856159 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-kvstk\""
Apr 24 21:30:22.856394 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.856372 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 21:30:22.857572 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.857542 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 21:30:22.857698 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.857574 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 21:30:22.857961 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.857941 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 24 21:30:22.858047 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.857941 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 21:30:22.858396 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.858378 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 21:30:22.859016 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.858996 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 24 21:30:22.862831 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.862804 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 24 21:30:22.864392 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.864370 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54bdb799b-9czmz"]
Apr 24 21:30:22.999868 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.999841 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-trusted-ca-bundle\") pod \"console-54bdb799b-9czmz\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.000001 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.999887 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-console-config\") pod \"console-54bdb799b-9czmz\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.000001 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.999907 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-console-serving-cert\") pod \"console-54bdb799b-9czmz\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.000001 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.999934 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-console-oauth-config\") pod \"console-54bdb799b-9czmz\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.000001 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:22.999992 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvc77\" (UniqueName: \"kubernetes.io/projected/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-kube-api-access-lvc77\") pod \"console-54bdb799b-9czmz\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.000192 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.000023 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-service-ca\") pod \"console-54bdb799b-9czmz\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.000192 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.000084 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-oauth-serving-cert\") pod \"console-54bdb799b-9czmz\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.101013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.100952 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-console-config\") pod \"console-54bdb799b-9czmz\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.101013 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.100996 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-console-serving-cert\") pod \"console-54bdb799b-9czmz\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.101474 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.101025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-console-oauth-config\") pod \"console-54bdb799b-9czmz\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.101474 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.101067 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lvc77\" (UniqueName: \"kubernetes.io/projected/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-kube-api-access-lvc77\") pod \"console-54bdb799b-9czmz\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.101474 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.101116 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-service-ca\") pod \"console-54bdb799b-9czmz\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.101474 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.101143 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-oauth-serving-cert\") pod \"console-54bdb799b-9czmz\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.101474 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.101219 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-trusted-ca-bundle\") pod \"console-54bdb799b-9czmz\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.101760 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.101736 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-console-config\") pod \"console-54bdb799b-9czmz\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.101950 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.101896 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-service-ca\") pod \"console-54bdb799b-9czmz\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.101950 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.101909 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-oauth-serving-cert\") pod \"console-54bdb799b-9czmz\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.102232 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.102211 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-trusted-ca-bundle\") pod \"console-54bdb799b-9czmz\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.103795 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.103774 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-console-serving-cert\") pod \"console-54bdb799b-9czmz\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.103881 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.103784 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-console-oauth-config\") pod \"console-54bdb799b-9czmz\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.110148 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.110109 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvc77\" (UniqueName: \"kubernetes.io/projected/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-kube-api-access-lvc77\") pod \"console-54bdb799b-9czmz\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.166055 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.166030 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54bdb799b-9czmz"
Apr 24 21:30:23.453211 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.451910 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-5xpg7"]
Apr 24 21:30:23.455493 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.455473 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5xpg7" Apr 24 21:30:23.458276 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.458125 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 24 21:30:23.458276 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.458169 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-xcj7c\"" Apr 24 21:30:23.466384 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.465339 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-5xpg7"] Apr 24 21:30:23.515061 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.515030 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54bdb799b-9czmz"] Apr 24 21:30:23.517618 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:30:23.517598 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ef3157f_b15f_49ba_95e9_b598c0c61ccd.slice/crio-ad55d84bedc1735f29ad520e04595da464579ec8e7d910d973feb04427ca2e38 WatchSource:0}: Error finding container ad55d84bedc1735f29ad520e04595da464579ec8e7d910d973feb04427ca2e38: Status 404 returned error can't find the container with id ad55d84bedc1735f29ad520e04595da464579ec8e7d910d973feb04427ca2e38 Apr 24 21:30:23.606219 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.606194 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e9c27418-28ab-495b-b376-26606025a79e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-5xpg7\" (UID: \"e9c27418-28ab-495b-b376-26606025a79e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5xpg7" Apr 24 21:30:23.707158 ip-10-0-136-160 kubenswrapper[2575]: I0424 
21:30:23.707134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e9c27418-28ab-495b-b376-26606025a79e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-5xpg7\" (UID: \"e9c27418-28ab-495b-b376-26606025a79e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5xpg7" Apr 24 21:30:23.709182 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.709159 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e9c27418-28ab-495b-b376-26606025a79e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-5xpg7\" (UID: \"e9c27418-28ab-495b-b376-26606025a79e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5xpg7" Apr 24 21:30:23.768702 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.768681 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5xpg7" Apr 24 21:30:23.885871 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:23.885845 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-5xpg7"] Apr 24 21:30:23.889617 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:30:23.887859 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9c27418_28ab_495b_b376_26606025a79e.slice/crio-9d254fbf5e7a00d40486317dbd8c07ebe7dc0f7f789fa908044ae6ea5572c331 WatchSource:0}: Error finding container 9d254fbf5e7a00d40486317dbd8c07ebe7dc0f7f789fa908044ae6ea5572c331: Status 404 returned error can't find the container with id 9d254fbf5e7a00d40486317dbd8c07ebe7dc0f7f789fa908044ae6ea5572c331 Apr 24 21:30:24.048056 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:24.048021 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"adbf7f0b-8326-4a75-8f71-049bc66e101f","Type":"ContainerStarted","Data":"6ba2a8c26359009d6aaf6bf4a012d069186b62480c7b7f4763bb1612f3d55c5a"} Apr 24 21:30:24.048056 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:24.048059 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adbf7f0b-8326-4a75-8f71-049bc66e101f","Type":"ContainerStarted","Data":"5cc33aefc59114416abbb2398eed783b6455aa70a7477d3029aab3c7b4dd44ba"} Apr 24 21:30:24.048219 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:24.048073 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adbf7f0b-8326-4a75-8f71-049bc66e101f","Type":"ContainerStarted","Data":"9b685d499164ca3a7c86f7b27e2b469d221e5ec3ff1a8cd36d65c7acfa063c5a"} Apr 24 21:30:24.048219 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:24.048084 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adbf7f0b-8326-4a75-8f71-049bc66e101f","Type":"ContainerStarted","Data":"37a9270cf374759aa76ad1245a5461b5784fd91cbf3e31be35e95c0d17404ef3"} Apr 24 21:30:24.048219 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:24.048094 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adbf7f0b-8326-4a75-8f71-049bc66e101f","Type":"ContainerStarted","Data":"e16585af8d23b109352086eca199531df54869f7d04b261039e3731c3c81fc33"} Apr 24 21:30:24.049221 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:24.049193 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5xpg7" event={"ID":"e9c27418-28ab-495b-b376-26606025a79e","Type":"ContainerStarted","Data":"9d254fbf5e7a00d40486317dbd8c07ebe7dc0f7f789fa908044ae6ea5572c331"} Apr 24 21:30:24.050281 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:24.050257 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-54bdb799b-9czmz" event={"ID":"2ef3157f-b15f-49ba-95e9-b598c0c61ccd","Type":"ContainerStarted","Data":"ad55d84bedc1735f29ad520e04595da464579ec8e7d910d973feb04427ca2e38"} Apr 24 21:30:24.578386 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:24.578359 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pzzlw" Apr 24 21:30:24.581617 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:24.581592 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-48pk8\"" Apr 24 21:30:24.588771 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:24.588749 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pzzlw" Apr 24 21:30:24.751717 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:24.751674 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pzzlw"] Apr 24 21:30:24.754871 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:30:24.754829 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51fc9513_bf57_4b5f_9a7c_f7325f046b26.slice/crio-ceb1d72bef13119cc982d80f9acb9f255d45b7a76d3c1ad7ff2c3b680261929d WatchSource:0}: Error finding container ceb1d72bef13119cc982d80f9acb9f255d45b7a76d3c1ad7ff2c3b680261929d: Status 404 returned error can't find the container with id ceb1d72bef13119cc982d80f9acb9f255d45b7a76d3c1ad7ff2c3b680261929d Apr 24 21:30:25.055373 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:25.055334 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pzzlw" event={"ID":"51fc9513-bf57-4b5f-9a7c-f7325f046b26","Type":"ContainerStarted","Data":"ceb1d72bef13119cc982d80f9acb9f255d45b7a76d3c1ad7ff2c3b680261929d"} Apr 24 21:30:25.058638 ip-10-0-136-160 kubenswrapper[2575]: I0424 
21:30:25.058611 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adbf7f0b-8326-4a75-8f71-049bc66e101f","Type":"ContainerStarted","Data":"e832440214bdc41550049a3f1e0bc4862ae6a33cfcbda70025dde26746e78c2d"} Apr 24 21:30:25.093548 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:25.093494 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.798162918 podStartE2EDuration="6.093479765s" podCreationTimestamp="2026-04-24 21:30:19 +0000 UTC" firstStartedPulling="2026-04-24 21:30:20.240190397 +0000 UTC m=+162.180036114" lastFinishedPulling="2026-04-24 21:30:24.53550725 +0000 UTC m=+166.475352961" observedRunningTime="2026-04-24 21:30:25.091961516 +0000 UTC m=+167.031807252" watchObservedRunningTime="2026-04-24 21:30:25.093479765 +0000 UTC m=+167.033325497" Apr 24 21:30:25.577896 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:25.577862 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l78bh" Apr 24 21:30:27.064682 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.064657 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54bdb799b-9czmz"] Apr 24 21:30:27.108058 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.108026 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6fd6f54958-48btq"] Apr 24 21:30:27.111243 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.111221 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.126635 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.126608 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fd6f54958-48btq"] Apr 24 21:30:27.240083 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.240003 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-console-config\") pod \"console-6fd6f54958-48btq\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.240083 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.240050 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-oauth-serving-cert\") pod \"console-6fd6f54958-48btq\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.240294 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.240132 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/56b53f30-0bdb-4b79-9eed-e5697be23a61-console-oauth-config\") pod \"console-6fd6f54958-48btq\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.240294 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.240159 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltvjh\" (UniqueName: \"kubernetes.io/projected/56b53f30-0bdb-4b79-9eed-e5697be23a61-kube-api-access-ltvjh\") pod \"console-6fd6f54958-48btq\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " pod="openshift-console/console-6fd6f54958-48btq" 
Apr 24 21:30:27.240294 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.240192 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/56b53f30-0bdb-4b79-9eed-e5697be23a61-console-serving-cert\") pod \"console-6fd6f54958-48btq\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.240294 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.240269 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-trusted-ca-bundle\") pod \"console-6fd6f54958-48btq\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.240484 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.240310 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-service-ca\") pod \"console-6fd6f54958-48btq\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.340845 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.340819 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-console-config\") pod \"console-6fd6f54958-48btq\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.340962 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.340854 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-oauth-serving-cert\") pod 
\"console-6fd6f54958-48btq\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.340962 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.340898 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/56b53f30-0bdb-4b79-9eed-e5697be23a61-console-oauth-config\") pod \"console-6fd6f54958-48btq\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.340962 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.340921 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ltvjh\" (UniqueName: \"kubernetes.io/projected/56b53f30-0bdb-4b79-9eed-e5697be23a61-kube-api-access-ltvjh\") pod \"console-6fd6f54958-48btq\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.340962 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.340958 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/56b53f30-0bdb-4b79-9eed-e5697be23a61-console-serving-cert\") pod \"console-6fd6f54958-48btq\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.341164 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.341105 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-trusted-ca-bundle\") pod \"console-6fd6f54958-48btq\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.341219 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.341179 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-service-ca\") pod \"console-6fd6f54958-48btq\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.341673 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.341651 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-console-config\") pod \"console-6fd6f54958-48btq\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.341785 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.341741 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-service-ca\") pod \"console-6fd6f54958-48btq\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.342628 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.342605 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-trusted-ca-bundle\") pod \"console-6fd6f54958-48btq\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.343356 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.343338 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/56b53f30-0bdb-4b79-9eed-e5697be23a61-console-serving-cert\") pod \"console-6fd6f54958-48btq\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.343356 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.343354 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-oauth-serving-cert\") pod \"console-6fd6f54958-48btq\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.343493 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.343478 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/56b53f30-0bdb-4b79-9eed-e5697be23a61-console-oauth-config\") pod \"console-6fd6f54958-48btq\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.350077 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.350054 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltvjh\" (UniqueName: \"kubernetes.io/projected/56b53f30-0bdb-4b79-9eed-e5697be23a61-kube-api-access-ltvjh\") pod \"console-6fd6f54958-48btq\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.421586 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.421552 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:27.534542 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:27.534519 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fd6f54958-48btq"] Apr 24 21:30:27.536254 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:30:27.536225 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56b53f30_0bdb_4b79_9eed_e5697be23a61.slice/crio-5d144ff8cce0a71ab66c5057892848b01c8d64b81af83668bca264a3de8bd919 WatchSource:0}: Error finding container 5d144ff8cce0a71ab66c5057892848b01c8d64b81af83668bca264a3de8bd919: Status 404 returned error can't find the container with id 5d144ff8cce0a71ab66c5057892848b01c8d64b81af83668bca264a3de8bd919 Apr 24 21:30:28.074606 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:28.074568 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5xpg7" event={"ID":"e9c27418-28ab-495b-b376-26606025a79e","Type":"ContainerStarted","Data":"50ace5e0246fc27b7a8858e9b55adb74df7d292bb4d5bf9deab75907409253fa"} Apr 24 21:30:28.074994 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:28.074792 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5xpg7" Apr 24 21:30:28.076251 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:28.076209 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fd6f54958-48btq" event={"ID":"56b53f30-0bdb-4b79-9eed-e5697be23a61","Type":"ContainerStarted","Data":"8472970c341f9ca9ca3de2dea261514de1970c7c4e8ec5d8fd9e0ef6bcdc4f95"} Apr 24 21:30:28.076251 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:28.076247 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fd6f54958-48btq" 
event={"ID":"56b53f30-0bdb-4b79-9eed-e5697be23a61","Type":"ContainerStarted","Data":"5d144ff8cce0a71ab66c5057892848b01c8d64b81af83668bca264a3de8bd919"} Apr 24 21:30:28.077772 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:28.077748 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pzzlw" event={"ID":"51fc9513-bf57-4b5f-9a7c-f7325f046b26","Type":"ContainerStarted","Data":"53b224c6a2ad41a6bb75618d7226f2f84ae98dce73303e8e767d383ef1cbeef2"} Apr 24 21:30:28.079407 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:28.079381 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54bdb799b-9czmz" event={"ID":"2ef3157f-b15f-49ba-95e9-b598c0c61ccd","Type":"ContainerStarted","Data":"e68396e72862da424433f263cd3627fd4a328ff784b3714e1320576abdac52d1"} Apr 24 21:30:28.080135 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:28.080114 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5xpg7" Apr 24 21:30:28.092012 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:28.091956 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5xpg7" podStartSLOduration=2.010416132 podStartE2EDuration="5.091945422s" podCreationTimestamp="2026-04-24 21:30:23 +0000 UTC" firstStartedPulling="2026-04-24 21:30:23.891767201 +0000 UTC m=+165.831612912" lastFinishedPulling="2026-04-24 21:30:26.973296491 +0000 UTC m=+168.913142202" observedRunningTime="2026-04-24 21:30:28.090563609 +0000 UTC m=+170.030409343" watchObservedRunningTime="2026-04-24 21:30:28.091945422 +0000 UTC m=+170.031791154" Apr 24 21:30:28.107849 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:28.107804 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-54bdb799b-9czmz" podStartSLOduration=2.649837562 podStartE2EDuration="6.107793968s" 
podCreationTimestamp="2026-04-24 21:30:22 +0000 UTC" firstStartedPulling="2026-04-24 21:30:23.519411924 +0000 UTC m=+165.459257642" lastFinishedPulling="2026-04-24 21:30:26.977368336 +0000 UTC m=+168.917214048" observedRunningTime="2026-04-24 21:30:28.106480757 +0000 UTC m=+170.046326490" watchObservedRunningTime="2026-04-24 21:30:28.107793968 +0000 UTC m=+170.047639700" Apr 24 21:30:28.128505 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:28.128464 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6fd6f54958-48btq" podStartSLOduration=1.128453357 podStartE2EDuration="1.128453357s" podCreationTimestamp="2026-04-24 21:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:30:28.127099855 +0000 UTC m=+170.066945587" watchObservedRunningTime="2026-04-24 21:30:28.128453357 +0000 UTC m=+170.068299089" Apr 24 21:30:28.142222 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:28.142181 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-pzzlw" podStartSLOduration=135.922705654 podStartE2EDuration="2m18.142172113s" podCreationTimestamp="2026-04-24 21:28:10 +0000 UTC" firstStartedPulling="2026-04-24 21:30:24.758413075 +0000 UTC m=+166.698258791" lastFinishedPulling="2026-04-24 21:30:26.977879534 +0000 UTC m=+168.917725250" observedRunningTime="2026-04-24 21:30:28.141549608 +0000 UTC m=+170.081395341" watchObservedRunningTime="2026-04-24 21:30:28.142172113 +0000 UTC m=+170.082017845" Apr 24 21:30:31.041539 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:31.041507 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8m598" Apr 24 21:30:31.302110 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:31.302033 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:30:32.996767 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:32.996738 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6d8fc7475c-md5gj" Apr 24 21:30:33.166155 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:33.166123 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-54bdb799b-9czmz" Apr 24 21:30:35.706386 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:35.706350 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6fd6f54958-48btq"] Apr 24 21:30:36.316651 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.316588 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" podUID="5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a" containerName="registry" containerID="cri-o://3dea42a2d2395e2038a0db69da500ef05bdbba73dfcd320aa57d50cb50d67e11" gracePeriod=30 Apr 24 21:30:36.548028 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.548008 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:30:36.609995 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.609935 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-installation-pull-secrets\") pod \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " Apr 24 21:30:36.609995 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.609969 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2qtb\" (UniqueName: \"kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-kube-api-access-v2qtb\") pod \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " Apr 24 21:30:36.610123 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.610019 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-certificates\") pod \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " Apr 24 21:30:36.610123 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.610055 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-tls\") pod \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " Apr 24 21:30:36.610123 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.610079 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-bound-sa-token\") pod \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " Apr 
24 21:30:36.610280 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.610127 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-ca-trust-extracted\") pod \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " Apr 24 21:30:36.610280 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.610156 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-image-registry-private-configuration\") pod \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " Apr 24 21:30:36.610280 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.610182 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-trusted-ca\") pod \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\" (UID: \"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a\") " Apr 24 21:30:36.610545 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.610521 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a" (UID: "5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:36.610790 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.610764 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a" (UID: "5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:36.612194 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.612171 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a" (UID: "5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:36.612415 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.612385 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a" (UID: "5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:36.612527 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.612480 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a" (UID: "5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:36.612527 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.612499 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-kube-api-access-v2qtb" (OuterVolumeSpecName: "kube-api-access-v2qtb") pod "5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a" (UID: "5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a"). InnerVolumeSpecName "kube-api-access-v2qtb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:36.612743 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.612721 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a" (UID: "5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:36.618664 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.618643 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a" (UID: "5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:30:36.710950 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.710912 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-installation-pull-secrets\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:30:36.710950 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.710945 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v2qtb\" (UniqueName: \"kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-kube-api-access-v2qtb\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:30:36.710950 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.710954 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-certificates\") on node 
\"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:30:36.711262 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.710967 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-registry-tls\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:30:36.711262 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.710977 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-bound-sa-token\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:30:36.711262 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.710985 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-ca-trust-extracted\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:30:36.711262 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.710995 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-image-registry-private-configuration\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:30:36.711262 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:36.711005 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a-trusted-ca\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:30:37.107403 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:37.107371 2575 generic.go:358] "Generic (PLEG): container finished" podID="5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a" containerID="3dea42a2d2395e2038a0db69da500ef05bdbba73dfcd320aa57d50cb50d67e11" exitCode=0 Apr 24 21:30:37.107537 ip-10-0-136-160 kubenswrapper[2575]: I0424 
21:30:37.107460 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" Apr 24 21:30:37.107537 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:37.107471 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" event={"ID":"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a","Type":"ContainerDied","Data":"3dea42a2d2395e2038a0db69da500ef05bdbba73dfcd320aa57d50cb50d67e11"} Apr 24 21:30:37.107537 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:37.107518 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f95988bbc-b6txh" event={"ID":"5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a","Type":"ContainerDied","Data":"ddc7b0ea465d2e3e49e1f2940b967649246ee820becf932596d91dae7bb478f5"} Apr 24 21:30:37.107637 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:37.107549 2575 scope.go:117] "RemoveContainer" containerID="3dea42a2d2395e2038a0db69da500ef05bdbba73dfcd320aa57d50cb50d67e11" Apr 24 21:30:37.115654 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:37.115631 2575 scope.go:117] "RemoveContainer" containerID="3dea42a2d2395e2038a0db69da500ef05bdbba73dfcd320aa57d50cb50d67e11" Apr 24 21:30:37.116005 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:30:37.115978 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dea42a2d2395e2038a0db69da500ef05bdbba73dfcd320aa57d50cb50d67e11\": container with ID starting with 3dea42a2d2395e2038a0db69da500ef05bdbba73dfcd320aa57d50cb50d67e11 not found: ID does not exist" containerID="3dea42a2d2395e2038a0db69da500ef05bdbba73dfcd320aa57d50cb50d67e11" Apr 24 21:30:37.116069 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:37.116015 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dea42a2d2395e2038a0db69da500ef05bdbba73dfcd320aa57d50cb50d67e11"} 
err="failed to get container status \"3dea42a2d2395e2038a0db69da500ef05bdbba73dfcd320aa57d50cb50d67e11\": rpc error: code = NotFound desc = could not find container \"3dea42a2d2395e2038a0db69da500ef05bdbba73dfcd320aa57d50cb50d67e11\": container with ID starting with 3dea42a2d2395e2038a0db69da500ef05bdbba73dfcd320aa57d50cb50d67e11 not found: ID does not exist" Apr 24 21:30:37.132404 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:37.132378 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5f95988bbc-b6txh"] Apr 24 21:30:37.138604 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:37.138576 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5f95988bbc-b6txh"] Apr 24 21:30:37.422673 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:37.422616 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:30:38.588102 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:38.588071 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a" path="/var/lib/kubelet/pods/5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a/volumes" Apr 24 21:30:53.098130 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.098083 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-54bdb799b-9czmz" podUID="2ef3157f-b15f-49ba-95e9-b598c0c61ccd" containerName="console" containerID="cri-o://e68396e72862da424433f263cd3627fd4a328ff784b3714e1320576abdac52d1" gracePeriod=15 Apr 24 21:30:53.328321 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.328291 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54bdb799b-9czmz_2ef3157f-b15f-49ba-95e9-b598c0c61ccd/console/0.log" Apr 24 21:30:53.328459 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.328361 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54bdb799b-9czmz" Apr 24 21:30:53.430787 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.430718 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-console-config\") pod \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " Apr 24 21:30:53.430787 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.430771 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-trusted-ca-bundle\") pod \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " Apr 24 21:30:53.430787 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.430792 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvc77\" (UniqueName: \"kubernetes.io/projected/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-kube-api-access-lvc77\") pod \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " Apr 24 21:30:53.431009 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.430844 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-oauth-serving-cert\") pod \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " Apr 24 21:30:53.431009 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.430866 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-console-serving-cert\") pod \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " Apr 24 21:30:53.431009 
ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.430911 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-console-oauth-config\") pod \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " Apr 24 21:30:53.431009 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.430949 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-service-ca\") pod \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\" (UID: \"2ef3157f-b15f-49ba-95e9-b598c0c61ccd\") " Apr 24 21:30:53.431207 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.431151 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-console-config" (OuterVolumeSpecName: "console-config") pod "2ef3157f-b15f-49ba-95e9-b598c0c61ccd" (UID: "2ef3157f-b15f-49ba-95e9-b598c0c61ccd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:53.431259 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.431222 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2ef3157f-b15f-49ba-95e9-b598c0c61ccd" (UID: "2ef3157f-b15f-49ba-95e9-b598c0c61ccd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:53.431259 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.431239 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2ef3157f-b15f-49ba-95e9-b598c0c61ccd" (UID: "2ef3157f-b15f-49ba-95e9-b598c0c61ccd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:53.431518 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.431493 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-service-ca" (OuterVolumeSpecName: "service-ca") pod "2ef3157f-b15f-49ba-95e9-b598c0c61ccd" (UID: "2ef3157f-b15f-49ba-95e9-b598c0c61ccd"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:53.433035 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.433006 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2ef3157f-b15f-49ba-95e9-b598c0c61ccd" (UID: "2ef3157f-b15f-49ba-95e9-b598c0c61ccd"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:53.433138 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.433073 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2ef3157f-b15f-49ba-95e9-b598c0c61ccd" (UID: "2ef3157f-b15f-49ba-95e9-b598c0c61ccd"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:53.433138 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.433089 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-kube-api-access-lvc77" (OuterVolumeSpecName: "kube-api-access-lvc77") pod "2ef3157f-b15f-49ba-95e9-b598c0c61ccd" (UID: "2ef3157f-b15f-49ba-95e9-b598c0c61ccd"). InnerVolumeSpecName "kube-api-access-lvc77". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:53.532160 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.532129 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-oauth-serving-cert\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:30:53.532160 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.532156 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-console-serving-cert\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:30:53.532160 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.532166 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-console-oauth-config\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:30:53.532319 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.532177 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-service-ca\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:30:53.532319 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.532186 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-console-config\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:30:53.532319 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.532194 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-trusted-ca-bundle\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:30:53.532319 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:53.532202 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lvc77\" (UniqueName: \"kubernetes.io/projected/2ef3157f-b15f-49ba-95e9-b598c0c61ccd-kube-api-access-lvc77\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:30:54.156959 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:54.156931 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54bdb799b-9czmz_2ef3157f-b15f-49ba-95e9-b598c0c61ccd/console/0.log" Apr 24 21:30:54.157364 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:54.156978 2575 generic.go:358] "Generic (PLEG): container finished" podID="2ef3157f-b15f-49ba-95e9-b598c0c61ccd" containerID="e68396e72862da424433f263cd3627fd4a328ff784b3714e1320576abdac52d1" exitCode=2 Apr 24 21:30:54.157364 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:54.157056 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54bdb799b-9czmz" Apr 24 21:30:54.157364 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:54.157061 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54bdb799b-9czmz" event={"ID":"2ef3157f-b15f-49ba-95e9-b598c0c61ccd","Type":"ContainerDied","Data":"e68396e72862da424433f263cd3627fd4a328ff784b3714e1320576abdac52d1"} Apr 24 21:30:54.157364 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:54.157169 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54bdb799b-9czmz" event={"ID":"2ef3157f-b15f-49ba-95e9-b598c0c61ccd","Type":"ContainerDied","Data":"ad55d84bedc1735f29ad520e04595da464579ec8e7d910d973feb04427ca2e38"} Apr 24 21:30:54.157364 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:54.157187 2575 scope.go:117] "RemoveContainer" containerID="e68396e72862da424433f263cd3627fd4a328ff784b3714e1320576abdac52d1" Apr 24 21:30:54.166113 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:54.166094 2575 scope.go:117] "RemoveContainer" containerID="e68396e72862da424433f263cd3627fd4a328ff784b3714e1320576abdac52d1" Apr 24 21:30:54.166359 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:30:54.166342 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e68396e72862da424433f263cd3627fd4a328ff784b3714e1320576abdac52d1\": container with ID starting with e68396e72862da424433f263cd3627fd4a328ff784b3714e1320576abdac52d1 not found: ID does not exist" containerID="e68396e72862da424433f263cd3627fd4a328ff784b3714e1320576abdac52d1" Apr 24 21:30:54.166404 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:54.166366 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e68396e72862da424433f263cd3627fd4a328ff784b3714e1320576abdac52d1"} err="failed to get container status \"e68396e72862da424433f263cd3627fd4a328ff784b3714e1320576abdac52d1\": rpc error: code = 
NotFound desc = could not find container \"e68396e72862da424433f263cd3627fd4a328ff784b3714e1320576abdac52d1\": container with ID starting with e68396e72862da424433f263cd3627fd4a328ff784b3714e1320576abdac52d1 not found: ID does not exist" Apr 24 21:30:54.180533 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:54.180505 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54bdb799b-9czmz"] Apr 24 21:30:54.185597 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:54.185571 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-54bdb799b-9czmz"] Apr 24 21:30:54.583073 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:54.583029 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ef3157f-b15f-49ba-95e9-b598c0c61ccd" path="/var/lib/kubelet/pods/2ef3157f-b15f-49ba-95e9-b598c0c61ccd/volumes" Apr 24 21:30:59.173918 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:59.173839 2575 generic.go:358] "Generic (PLEG): container finished" podID="114bfe15-0df7-402e-b377-0bf72321706b" containerID="690139a763246477579fc4b2213796275b6991744d35c34392f3a40022fc22d9" exitCode=0 Apr 24 21:30:59.173918 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:59.173883 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-l5zd6" event={"ID":"114bfe15-0df7-402e-b377-0bf72321706b","Type":"ContainerDied","Data":"690139a763246477579fc4b2213796275b6991744d35c34392f3a40022fc22d9"} Apr 24 21:30:59.174324 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:30:59.174214 2575 scope.go:117] "RemoveContainer" containerID="690139a763246477579fc4b2213796275b6991744d35c34392f3a40022fc22d9" Apr 24 21:31:00.178003 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:00.177967 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-l5zd6" 
event={"ID":"114bfe15-0df7-402e-b377-0bf72321706b","Type":"ContainerStarted","Data":"3ffd9558166524e8570646c355330fe66792f964e390336b82c196a1bb7364d2"} Apr 24 21:31:00.725275 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:00.725231 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6fd6f54958-48btq" podUID="56b53f30-0bdb-4b79-9eed-e5697be23a61" containerName="console" containerID="cri-o://8472970c341f9ca9ca3de2dea261514de1970c7c4e8ec5d8fd9e0ef6bcdc4f95" gracePeriod=15 Apr 24 21:31:00.966713 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:00.966692 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fd6f54958-48btq_56b53f30-0bdb-4b79-9eed-e5697be23a61/console/0.log" Apr 24 21:31:00.966838 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:00.966760 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:31:01.089407 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.089348 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-trusted-ca-bundle\") pod \"56b53f30-0bdb-4b79-9eed-e5697be23a61\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " Apr 24 21:31:01.089407 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.089379 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-console-config\") pod \"56b53f30-0bdb-4b79-9eed-e5697be23a61\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " Apr 24 21:31:01.089570 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.089415 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltvjh\" (UniqueName: 
\"kubernetes.io/projected/56b53f30-0bdb-4b79-9eed-e5697be23a61-kube-api-access-ltvjh\") pod \"56b53f30-0bdb-4b79-9eed-e5697be23a61\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " Apr 24 21:31:01.089570 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.089478 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/56b53f30-0bdb-4b79-9eed-e5697be23a61-console-serving-cert\") pod \"56b53f30-0bdb-4b79-9eed-e5697be23a61\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " Apr 24 21:31:01.089570 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.089521 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/56b53f30-0bdb-4b79-9eed-e5697be23a61-console-oauth-config\") pod \"56b53f30-0bdb-4b79-9eed-e5697be23a61\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " Apr 24 21:31:01.089570 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.089556 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-service-ca\") pod \"56b53f30-0bdb-4b79-9eed-e5697be23a61\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " Apr 24 21:31:01.089727 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.089615 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-oauth-serving-cert\") pod \"56b53f30-0bdb-4b79-9eed-e5697be23a61\" (UID: \"56b53f30-0bdb-4b79-9eed-e5697be23a61\") " Apr 24 21:31:01.089783 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.089752 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-console-config" (OuterVolumeSpecName: "console-config") pod 
"56b53f30-0bdb-4b79-9eed-e5697be23a61" (UID: "56b53f30-0bdb-4b79-9eed-e5697be23a61"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:01.089838 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.089805 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "56b53f30-0bdb-4b79-9eed-e5697be23a61" (UID: "56b53f30-0bdb-4b79-9eed-e5697be23a61"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:01.089888 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.089855 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-trusted-ca-bundle\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:31:01.089888 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.089870 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-console-config\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:31:01.090102 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.090066 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-service-ca" (OuterVolumeSpecName: "service-ca") pod "56b53f30-0bdb-4b79-9eed-e5697be23a61" (UID: "56b53f30-0bdb-4b79-9eed-e5697be23a61"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:01.090222 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.090121 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "56b53f30-0bdb-4b79-9eed-e5697be23a61" (UID: "56b53f30-0bdb-4b79-9eed-e5697be23a61"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:01.091582 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.091561 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56b53f30-0bdb-4b79-9eed-e5697be23a61-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "56b53f30-0bdb-4b79-9eed-e5697be23a61" (UID: "56b53f30-0bdb-4b79-9eed-e5697be23a61"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:01.091974 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.091957 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56b53f30-0bdb-4b79-9eed-e5697be23a61-kube-api-access-ltvjh" (OuterVolumeSpecName: "kube-api-access-ltvjh") pod "56b53f30-0bdb-4b79-9eed-e5697be23a61" (UID: "56b53f30-0bdb-4b79-9eed-e5697be23a61"). InnerVolumeSpecName "kube-api-access-ltvjh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:31:01.092051 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.091966 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56b53f30-0bdb-4b79-9eed-e5697be23a61-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "56b53f30-0bdb-4b79-9eed-e5697be23a61" (UID: "56b53f30-0bdb-4b79-9eed-e5697be23a61"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:01.182025 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.182007 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fd6f54958-48btq_56b53f30-0bdb-4b79-9eed-e5697be23a61/console/0.log" Apr 24 21:31:01.182316 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.182042 2575 generic.go:358] "Generic (PLEG): container finished" podID="56b53f30-0bdb-4b79-9eed-e5697be23a61" containerID="8472970c341f9ca9ca3de2dea261514de1970c7c4e8ec5d8fd9e0ef6bcdc4f95" exitCode=2 Apr 24 21:31:01.182316 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.182073 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fd6f54958-48btq" event={"ID":"56b53f30-0bdb-4b79-9eed-e5697be23a61","Type":"ContainerDied","Data":"8472970c341f9ca9ca3de2dea261514de1970c7c4e8ec5d8fd9e0ef6bcdc4f95"} Apr 24 21:31:01.182316 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.182104 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fd6f54958-48btq" Apr 24 21:31:01.182316 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.182111 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fd6f54958-48btq" event={"ID":"56b53f30-0bdb-4b79-9eed-e5697be23a61","Type":"ContainerDied","Data":"5d144ff8cce0a71ab66c5057892848b01c8d64b81af83668bca264a3de8bd919"} Apr 24 21:31:01.182316 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.182127 2575 scope.go:117] "RemoveContainer" containerID="8472970c341f9ca9ca3de2dea261514de1970c7c4e8ec5d8fd9e0ef6bcdc4f95" Apr 24 21:31:01.189941 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.189920 2575 scope.go:117] "RemoveContainer" containerID="8472970c341f9ca9ca3de2dea261514de1970c7c4e8ec5d8fd9e0ef6bcdc4f95" Apr 24 21:31:01.190228 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:31:01.190209 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8472970c341f9ca9ca3de2dea261514de1970c7c4e8ec5d8fd9e0ef6bcdc4f95\": container with ID starting with 8472970c341f9ca9ca3de2dea261514de1970c7c4e8ec5d8fd9e0ef6bcdc4f95 not found: ID does not exist" containerID="8472970c341f9ca9ca3de2dea261514de1970c7c4e8ec5d8fd9e0ef6bcdc4f95" Apr 24 21:31:01.190307 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.190242 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8472970c341f9ca9ca3de2dea261514de1970c7c4e8ec5d8fd9e0ef6bcdc4f95"} err="failed to get container status \"8472970c341f9ca9ca3de2dea261514de1970c7c4e8ec5d8fd9e0ef6bcdc4f95\": rpc error: code = NotFound desc = could not find container \"8472970c341f9ca9ca3de2dea261514de1970c7c4e8ec5d8fd9e0ef6bcdc4f95\": container with ID starting with 8472970c341f9ca9ca3de2dea261514de1970c7c4e8ec5d8fd9e0ef6bcdc4f95 not found: ID does not exist" Apr 24 21:31:01.190307 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.190214 2575 
reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-oauth-serving-cert\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:31:01.190307 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.190282 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ltvjh\" (UniqueName: \"kubernetes.io/projected/56b53f30-0bdb-4b79-9eed-e5697be23a61-kube-api-access-ltvjh\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:31:01.190307 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.190299 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/56b53f30-0bdb-4b79-9eed-e5697be23a61-console-serving-cert\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:31:01.190486 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.190314 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/56b53f30-0bdb-4b79-9eed-e5697be23a61-console-oauth-config\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:31:01.190486 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.190329 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56b53f30-0bdb-4b79-9eed-e5697be23a61-service-ca\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:31:01.206514 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.206496 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6fd6f54958-48btq"] Apr 24 21:31:01.210365 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:01.210348 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6fd6f54958-48btq"] Apr 24 21:31:02.187856 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:02.187808 2575 generic.go:358] "Generic (PLEG): 
container finished" podID="d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f" containerID="9db1c70184c8c63d90359a59cfdd90b3293280e02148c74f181877190a0d0b25" exitCode=0 Apr 24 21:31:02.188356 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:02.187892 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-szrrw" event={"ID":"d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f","Type":"ContainerDied","Data":"9db1c70184c8c63d90359a59cfdd90b3293280e02148c74f181877190a0d0b25"} Apr 24 21:31:02.188464 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:02.188352 2575 scope.go:117] "RemoveContainer" containerID="9db1c70184c8c63d90359a59cfdd90b3293280e02148c74f181877190a0d0b25" Apr 24 21:31:02.581807 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:02.581741 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56b53f30-0bdb-4b79-9eed-e5697be23a61" path="/var/lib/kubelet/pods/56b53f30-0bdb-4b79-9eed-e5697be23a61/volumes" Apr 24 21:31:03.194224 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:03.194192 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-szrrw" event={"ID":"d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f","Type":"ContainerStarted","Data":"95e6cba11502659af1dfb0fde08a5d649165ad3eaf6c35820be9d9590934ae5f"} Apr 24 21:31:14.229955 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:14.229920 2575 generic.go:358] "Generic (PLEG): container finished" podID="2e9ba745-815f-4019-a172-e88557fff65c" containerID="2bccd1a6728fa83e818ca57c809d6f6e7055901a25dd0048346e875b525691fd" exitCode=0 Apr 24 21:31:14.230319 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:14.229972 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fbtc5" 
event={"ID":"2e9ba745-815f-4019-a172-e88557fff65c","Type":"ContainerDied","Data":"2bccd1a6728fa83e818ca57c809d6f6e7055901a25dd0048346e875b525691fd"} Apr 24 21:31:14.230319 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:14.230298 2575 scope.go:117] "RemoveContainer" containerID="2bccd1a6728fa83e818ca57c809d6f6e7055901a25dd0048346e875b525691fd" Apr 24 21:31:15.234584 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:15.234550 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fbtc5" event={"ID":"2e9ba745-815f-4019-a172-e88557fff65c","Type":"ContainerStarted","Data":"ccdb2d6e1792aebf9f1ca4995f47b61c6ab23ae44088160a288edd8be74cf304"} Apr 24 21:31:39.435294 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:39.435258 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:31:39.435823 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:39.435751 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="alertmanager" containerID="cri-o://e16585af8d23b109352086eca199531df54869f7d04b261039e3731c3c81fc33" gracePeriod=120 Apr 24 21:31:39.436008 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:39.435819 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="kube-rbac-proxy-metric" containerID="cri-o://6ba2a8c26359009d6aaf6bf4a012d069186b62480c7b7f4763bb1612f3d55c5a" gracePeriod=120 Apr 24 21:31:39.436008 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:39.435850 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="kube-rbac-proxy-web" 
containerID="cri-o://9b685d499164ca3a7c86f7b27e2b469d221e5ec3ff1a8cd36d65c7acfa063c5a" gracePeriod=120 Apr 24 21:31:39.436008 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:39.435865 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="prom-label-proxy" containerID="cri-o://e832440214bdc41550049a3f1e0bc4862ae6a33cfcbda70025dde26746e78c2d" gracePeriod=120 Apr 24 21:31:39.436190 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:39.436018 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="kube-rbac-proxy" containerID="cri-o://5cc33aefc59114416abbb2398eed783b6455aa70a7477d3029aab3c7b4dd44ba" gracePeriod=120 Apr 24 21:31:39.436190 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:39.436105 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="config-reloader" containerID="cri-o://37a9270cf374759aa76ad1245a5461b5784fd91cbf3e31be35e95c0d17404ef3" gracePeriod=120 Apr 24 21:31:40.318790 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.318756 2575 generic.go:358] "Generic (PLEG): container finished" podID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerID="e832440214bdc41550049a3f1e0bc4862ae6a33cfcbda70025dde26746e78c2d" exitCode=0 Apr 24 21:31:40.318790 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.318784 2575 generic.go:358] "Generic (PLEG): container finished" podID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerID="5cc33aefc59114416abbb2398eed783b6455aa70a7477d3029aab3c7b4dd44ba" exitCode=0 Apr 24 21:31:40.318790 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.318792 2575 generic.go:358] "Generic (PLEG): container finished" podID="adbf7f0b-8326-4a75-8f71-049bc66e101f" 
containerID="37a9270cf374759aa76ad1245a5461b5784fd91cbf3e31be35e95c0d17404ef3" exitCode=0 Apr 24 21:31:40.318790 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.318798 2575 generic.go:358] "Generic (PLEG): container finished" podID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerID="e16585af8d23b109352086eca199531df54869f7d04b261039e3731c3c81fc33" exitCode=0 Apr 24 21:31:40.319032 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.318838 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adbf7f0b-8326-4a75-8f71-049bc66e101f","Type":"ContainerDied","Data":"e832440214bdc41550049a3f1e0bc4862ae6a33cfcbda70025dde26746e78c2d"} Apr 24 21:31:40.319032 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.318875 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adbf7f0b-8326-4a75-8f71-049bc66e101f","Type":"ContainerDied","Data":"5cc33aefc59114416abbb2398eed783b6455aa70a7477d3029aab3c7b4dd44ba"} Apr 24 21:31:40.319032 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.318890 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adbf7f0b-8326-4a75-8f71-049bc66e101f","Type":"ContainerDied","Data":"37a9270cf374759aa76ad1245a5461b5784fd91cbf3e31be35e95c0d17404ef3"} Apr 24 21:31:40.319032 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.318902 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adbf7f0b-8326-4a75-8f71-049bc66e101f","Type":"ContainerDied","Data":"e16585af8d23b109352086eca199531df54869f7d04b261039e3731c3c81fc33"} Apr 24 21:31:40.670237 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.670216 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:40.789002 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.788974 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-cluster-tls-config\") pod \"adbf7f0b-8326-4a75-8f71-049bc66e101f\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " Apr 24 21:31:40.789138 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.789019 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-main-tls\") pod \"adbf7f0b-8326-4a75-8f71-049bc66e101f\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " Apr 24 21:31:40.789138 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.789065 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-config-volume\") pod \"adbf7f0b-8326-4a75-8f71-049bc66e101f\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " Apr 24 21:31:40.789138 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.789094 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adbf7f0b-8326-4a75-8f71-049bc66e101f-alertmanager-trusted-ca-bundle\") pod \"adbf7f0b-8326-4a75-8f71-049bc66e101f\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " Apr 24 21:31:40.789304 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.789211 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzl2t\" (UniqueName: \"kubernetes.io/projected/adbf7f0b-8326-4a75-8f71-049bc66e101f-kube-api-access-qzl2t\") pod \"adbf7f0b-8326-4a75-8f71-049bc66e101f\" (UID: 
\"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " Apr 24 21:31:40.789304 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.789256 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-kube-rbac-proxy-web\") pod \"adbf7f0b-8326-4a75-8f71-049bc66e101f\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " Apr 24 21:31:40.789304 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.789283 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/adbf7f0b-8326-4a75-8f71-049bc66e101f-tls-assets\") pod \"adbf7f0b-8326-4a75-8f71-049bc66e101f\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " Apr 24 21:31:40.789470 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.789305 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/adbf7f0b-8326-4a75-8f71-049bc66e101f-metrics-client-ca\") pod \"adbf7f0b-8326-4a75-8f71-049bc66e101f\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " Apr 24 21:31:40.789470 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.789362 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-web-config\") pod \"adbf7f0b-8326-4a75-8f71-049bc66e101f\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " Apr 24 21:31:40.789470 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.789398 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/adbf7f0b-8326-4a75-8f71-049bc66e101f-alertmanager-main-db\") pod \"adbf7f0b-8326-4a75-8f71-049bc66e101f\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " Apr 24 21:31:40.789470 ip-10-0-136-160 
kubenswrapper[2575]: I0424 21:31:40.789459 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"adbf7f0b-8326-4a75-8f71-049bc66e101f\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " Apr 24 21:31:40.789672 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.789490 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/adbf7f0b-8326-4a75-8f71-049bc66e101f-config-out\") pod \"adbf7f0b-8326-4a75-8f71-049bc66e101f\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " Apr 24 21:31:40.789672 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.789486 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adbf7f0b-8326-4a75-8f71-049bc66e101f-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "adbf7f0b-8326-4a75-8f71-049bc66e101f" (UID: "adbf7f0b-8326-4a75-8f71-049bc66e101f"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:40.789672 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.789541 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-kube-rbac-proxy\") pod \"adbf7f0b-8326-4a75-8f71-049bc66e101f\" (UID: \"adbf7f0b-8326-4a75-8f71-049bc66e101f\") " Apr 24 21:31:40.789828 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.789796 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adbf7f0b-8326-4a75-8f71-049bc66e101f-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:31:40.791054 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.790713 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adbf7f0b-8326-4a75-8f71-049bc66e101f-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "adbf7f0b-8326-4a75-8f71-049bc66e101f" (UID: "adbf7f0b-8326-4a75-8f71-049bc66e101f"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:40.791054 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.791007 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adbf7f0b-8326-4a75-8f71-049bc66e101f-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "adbf7f0b-8326-4a75-8f71-049bc66e101f" (UID: "adbf7f0b-8326-4a75-8f71-049bc66e101f"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:31:40.791675 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.791633 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-config-volume" (OuterVolumeSpecName: "config-volume") pod "adbf7f0b-8326-4a75-8f71-049bc66e101f" (UID: "adbf7f0b-8326-4a75-8f71-049bc66e101f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:40.792734 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.792698 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "adbf7f0b-8326-4a75-8f71-049bc66e101f" (UID: "adbf7f0b-8326-4a75-8f71-049bc66e101f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:40.792916 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.792888 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adbf7f0b-8326-4a75-8f71-049bc66e101f-kube-api-access-qzl2t" (OuterVolumeSpecName: "kube-api-access-qzl2t") pod "adbf7f0b-8326-4a75-8f71-049bc66e101f" (UID: "adbf7f0b-8326-4a75-8f71-049bc66e101f"). InnerVolumeSpecName "kube-api-access-qzl2t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:31:40.792985 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.792939 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adbf7f0b-8326-4a75-8f71-049bc66e101f-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "adbf7f0b-8326-4a75-8f71-049bc66e101f" (UID: "adbf7f0b-8326-4a75-8f71-049bc66e101f"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:31:40.793115 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.793096 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "adbf7f0b-8326-4a75-8f71-049bc66e101f" (UID: "adbf7f0b-8326-4a75-8f71-049bc66e101f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:40.793304 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.793288 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "adbf7f0b-8326-4a75-8f71-049bc66e101f" (UID: "adbf7f0b-8326-4a75-8f71-049bc66e101f"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:40.793590 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.793575 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adbf7f0b-8326-4a75-8f71-049bc66e101f-config-out" (OuterVolumeSpecName: "config-out") pod "adbf7f0b-8326-4a75-8f71-049bc66e101f" (UID: "adbf7f0b-8326-4a75-8f71-049bc66e101f"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:31:40.794030 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.794016 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "adbf7f0b-8326-4a75-8f71-049bc66e101f" (UID: "adbf7f0b-8326-4a75-8f71-049bc66e101f"). 
InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:40.796383 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.796263 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "adbf7f0b-8326-4a75-8f71-049bc66e101f" (UID: "adbf7f0b-8326-4a75-8f71-049bc66e101f"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:40.801683 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.801664 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-web-config" (OuterVolumeSpecName: "web-config") pod "adbf7f0b-8326-4a75-8f71-049bc66e101f" (UID: "adbf7f0b-8326-4a75-8f71-049bc66e101f"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:40.890256 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.890206 2575 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-config-volume\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:31:40.890256 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.890227 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qzl2t\" (UniqueName: \"kubernetes.io/projected/adbf7f0b-8326-4a75-8f71-049bc66e101f-kube-api-access-qzl2t\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:31:40.890256 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.890237 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-kube-rbac-proxy-web\") on node 
\"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:31:40.890256 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.890249 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/adbf7f0b-8326-4a75-8f71-049bc66e101f-tls-assets\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:31:40.890256 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.890258 2575 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/adbf7f0b-8326-4a75-8f71-049bc66e101f-metrics-client-ca\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:31:40.890467 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.890267 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-web-config\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:31:40.890467 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.890276 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/adbf7f0b-8326-4a75-8f71-049bc66e101f-alertmanager-main-db\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:31:40.890467 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.890284 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:31:40.890467 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.890293 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/adbf7f0b-8326-4a75-8f71-049bc66e101f-config-out\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:31:40.890467 
ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.890302 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:31:40.890467 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.890312 2575 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-cluster-tls-config\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:31:40.890467 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:40.890320 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/adbf7f0b-8326-4a75-8f71-049bc66e101f-secret-alertmanager-main-tls\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:31:41.324814 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.324784 2575 generic.go:358] "Generic (PLEG): container finished" podID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerID="6ba2a8c26359009d6aaf6bf4a012d069186b62480c7b7f4763bb1612f3d55c5a" exitCode=0 Apr 24 21:31:41.324814 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.324808 2575 generic.go:358] "Generic (PLEG): container finished" podID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerID="9b685d499164ca3a7c86f7b27e2b469d221e5ec3ff1a8cd36d65c7acfa063c5a" exitCode=0 Apr 24 21:31:41.324958 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.324830 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adbf7f0b-8326-4a75-8f71-049bc66e101f","Type":"ContainerDied","Data":"6ba2a8c26359009d6aaf6bf4a012d069186b62480c7b7f4763bb1612f3d55c5a"} Apr 24 21:31:41.324958 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.324859 2575 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adbf7f0b-8326-4a75-8f71-049bc66e101f","Type":"ContainerDied","Data":"9b685d499164ca3a7c86f7b27e2b469d221e5ec3ff1a8cd36d65c7acfa063c5a"} Apr 24 21:31:41.324958 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.324871 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adbf7f0b-8326-4a75-8f71-049bc66e101f","Type":"ContainerDied","Data":"85036e007e32852b7890c5136a3c6e0ae5ea0dd3f3b87b177787a3828b6c3831"} Apr 24 21:31:41.324958 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.324888 2575 scope.go:117] "RemoveContainer" containerID="e832440214bdc41550049a3f1e0bc4862ae6a33cfcbda70025dde26746e78c2d" Apr 24 21:31:41.324958 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.324926 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.333535 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.333490 2575 scope.go:117] "RemoveContainer" containerID="6ba2a8c26359009d6aaf6bf4a012d069186b62480c7b7f4763bb1612f3d55c5a" Apr 24 21:31:41.340918 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.340900 2575 scope.go:117] "RemoveContainer" containerID="5cc33aefc59114416abbb2398eed783b6455aa70a7477d3029aab3c7b4dd44ba" Apr 24 21:31:41.347307 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.347286 2575 scope.go:117] "RemoveContainer" containerID="9b685d499164ca3a7c86f7b27e2b469d221e5ec3ff1a8cd36d65c7acfa063c5a" Apr 24 21:31:41.351590 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.351563 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:31:41.354407 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.354387 2575 scope.go:117] "RemoveContainer" containerID="37a9270cf374759aa76ad1245a5461b5784fd91cbf3e31be35e95c0d17404ef3" Apr 24 21:31:41.358638 ip-10-0-136-160 kubenswrapper[2575]: I0424 
21:31:41.358613 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:31:41.361236 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.361223 2575 scope.go:117] "RemoveContainer" containerID="e16585af8d23b109352086eca199531df54869f7d04b261039e3731c3c81fc33" Apr 24 21:31:41.367412 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.367395 2575 scope.go:117] "RemoveContainer" containerID="f4e236fe3d745a1c20b1526810651ddcbd2e2de70d9943ae8246ecb6631c8fc6" Apr 24 21:31:41.373382 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.373366 2575 scope.go:117] "RemoveContainer" containerID="e832440214bdc41550049a3f1e0bc4862ae6a33cfcbda70025dde26746e78c2d" Apr 24 21:31:41.373632 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:31:41.373613 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e832440214bdc41550049a3f1e0bc4862ae6a33cfcbda70025dde26746e78c2d\": container with ID starting with e832440214bdc41550049a3f1e0bc4862ae6a33cfcbda70025dde26746e78c2d not found: ID does not exist" containerID="e832440214bdc41550049a3f1e0bc4862ae6a33cfcbda70025dde26746e78c2d" Apr 24 21:31:41.373679 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.373639 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e832440214bdc41550049a3f1e0bc4862ae6a33cfcbda70025dde26746e78c2d"} err="failed to get container status \"e832440214bdc41550049a3f1e0bc4862ae6a33cfcbda70025dde26746e78c2d\": rpc error: code = NotFound desc = could not find container \"e832440214bdc41550049a3f1e0bc4862ae6a33cfcbda70025dde26746e78c2d\": container with ID starting with e832440214bdc41550049a3f1e0bc4862ae6a33cfcbda70025dde26746e78c2d not found: ID does not exist" Apr 24 21:31:41.373679 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.373657 2575 scope.go:117] "RemoveContainer" 
containerID="6ba2a8c26359009d6aaf6bf4a012d069186b62480c7b7f4763bb1612f3d55c5a" Apr 24 21:31:41.373931 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:31:41.373913 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ba2a8c26359009d6aaf6bf4a012d069186b62480c7b7f4763bb1612f3d55c5a\": container with ID starting with 6ba2a8c26359009d6aaf6bf4a012d069186b62480c7b7f4763bb1612f3d55c5a not found: ID does not exist" containerID="6ba2a8c26359009d6aaf6bf4a012d069186b62480c7b7f4763bb1612f3d55c5a" Apr 24 21:31:41.373974 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.373937 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ba2a8c26359009d6aaf6bf4a012d069186b62480c7b7f4763bb1612f3d55c5a"} err="failed to get container status \"6ba2a8c26359009d6aaf6bf4a012d069186b62480c7b7f4763bb1612f3d55c5a\": rpc error: code = NotFound desc = could not find container \"6ba2a8c26359009d6aaf6bf4a012d069186b62480c7b7f4763bb1612f3d55c5a\": container with ID starting with 6ba2a8c26359009d6aaf6bf4a012d069186b62480c7b7f4763bb1612f3d55c5a not found: ID does not exist" Apr 24 21:31:41.373974 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.373953 2575 scope.go:117] "RemoveContainer" containerID="5cc33aefc59114416abbb2398eed783b6455aa70a7477d3029aab3c7b4dd44ba" Apr 24 21:31:41.374183 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:31:41.374166 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cc33aefc59114416abbb2398eed783b6455aa70a7477d3029aab3c7b4dd44ba\": container with ID starting with 5cc33aefc59114416abbb2398eed783b6455aa70a7477d3029aab3c7b4dd44ba not found: ID does not exist" containerID="5cc33aefc59114416abbb2398eed783b6455aa70a7477d3029aab3c7b4dd44ba" Apr 24 21:31:41.374226 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.374191 2575 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"5cc33aefc59114416abbb2398eed783b6455aa70a7477d3029aab3c7b4dd44ba"} err="failed to get container status \"5cc33aefc59114416abbb2398eed783b6455aa70a7477d3029aab3c7b4dd44ba\": rpc error: code = NotFound desc = could not find container \"5cc33aefc59114416abbb2398eed783b6455aa70a7477d3029aab3c7b4dd44ba\": container with ID starting with 5cc33aefc59114416abbb2398eed783b6455aa70a7477d3029aab3c7b4dd44ba not found: ID does not exist" Apr 24 21:31:41.374226 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.374208 2575 scope.go:117] "RemoveContainer" containerID="9b685d499164ca3a7c86f7b27e2b469d221e5ec3ff1a8cd36d65c7acfa063c5a" Apr 24 21:31:41.374503 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:31:41.374476 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b685d499164ca3a7c86f7b27e2b469d221e5ec3ff1a8cd36d65c7acfa063c5a\": container with ID starting with 9b685d499164ca3a7c86f7b27e2b469d221e5ec3ff1a8cd36d65c7acfa063c5a not found: ID does not exist" containerID="9b685d499164ca3a7c86f7b27e2b469d221e5ec3ff1a8cd36d65c7acfa063c5a" Apr 24 21:31:41.374579 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.374513 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b685d499164ca3a7c86f7b27e2b469d221e5ec3ff1a8cd36d65c7acfa063c5a"} err="failed to get container status \"9b685d499164ca3a7c86f7b27e2b469d221e5ec3ff1a8cd36d65c7acfa063c5a\": rpc error: code = NotFound desc = could not find container \"9b685d499164ca3a7c86f7b27e2b469d221e5ec3ff1a8cd36d65c7acfa063c5a\": container with ID starting with 9b685d499164ca3a7c86f7b27e2b469d221e5ec3ff1a8cd36d65c7acfa063c5a not found: ID does not exist" Apr 24 21:31:41.374579 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.374535 2575 scope.go:117] "RemoveContainer" containerID="37a9270cf374759aa76ad1245a5461b5784fd91cbf3e31be35e95c0d17404ef3" Apr 24 21:31:41.374782 
ip-10-0-136-160 kubenswrapper[2575]: E0424 21:31:41.374765 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37a9270cf374759aa76ad1245a5461b5784fd91cbf3e31be35e95c0d17404ef3\": container with ID starting with 37a9270cf374759aa76ad1245a5461b5784fd91cbf3e31be35e95c0d17404ef3 not found: ID does not exist" containerID="37a9270cf374759aa76ad1245a5461b5784fd91cbf3e31be35e95c0d17404ef3" Apr 24 21:31:41.374818 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.374787 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a9270cf374759aa76ad1245a5461b5784fd91cbf3e31be35e95c0d17404ef3"} err="failed to get container status \"37a9270cf374759aa76ad1245a5461b5784fd91cbf3e31be35e95c0d17404ef3\": rpc error: code = NotFound desc = could not find container \"37a9270cf374759aa76ad1245a5461b5784fd91cbf3e31be35e95c0d17404ef3\": container with ID starting with 37a9270cf374759aa76ad1245a5461b5784fd91cbf3e31be35e95c0d17404ef3 not found: ID does not exist" Apr 24 21:31:41.374818 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.374802 2575 scope.go:117] "RemoveContainer" containerID="e16585af8d23b109352086eca199531df54869f7d04b261039e3731c3c81fc33" Apr 24 21:31:41.375026 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:31:41.375009 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e16585af8d23b109352086eca199531df54869f7d04b261039e3731c3c81fc33\": container with ID starting with e16585af8d23b109352086eca199531df54869f7d04b261039e3731c3c81fc33 not found: ID does not exist" containerID="e16585af8d23b109352086eca199531df54869f7d04b261039e3731c3c81fc33" Apr 24 21:31:41.375076 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.375027 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e16585af8d23b109352086eca199531df54869f7d04b261039e3731c3c81fc33"} err="failed to get container status \"e16585af8d23b109352086eca199531df54869f7d04b261039e3731c3c81fc33\": rpc error: code = NotFound desc = could not find container \"e16585af8d23b109352086eca199531df54869f7d04b261039e3731c3c81fc33\": container with ID starting with e16585af8d23b109352086eca199531df54869f7d04b261039e3731c3c81fc33 not found: ID does not exist" Apr 24 21:31:41.375076 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.375039 2575 scope.go:117] "RemoveContainer" containerID="f4e236fe3d745a1c20b1526810651ddcbd2e2de70d9943ae8246ecb6631c8fc6" Apr 24 21:31:41.375272 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:31:41.375255 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4e236fe3d745a1c20b1526810651ddcbd2e2de70d9943ae8246ecb6631c8fc6\": container with ID starting with f4e236fe3d745a1c20b1526810651ddcbd2e2de70d9943ae8246ecb6631c8fc6 not found: ID does not exist" containerID="f4e236fe3d745a1c20b1526810651ddcbd2e2de70d9943ae8246ecb6631c8fc6" Apr 24 21:31:41.375308 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.375277 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e236fe3d745a1c20b1526810651ddcbd2e2de70d9943ae8246ecb6631c8fc6"} err="failed to get container status \"f4e236fe3d745a1c20b1526810651ddcbd2e2de70d9943ae8246ecb6631c8fc6\": rpc error: code = NotFound desc = could not find container \"f4e236fe3d745a1c20b1526810651ddcbd2e2de70d9943ae8246ecb6631c8fc6\": container with ID starting with f4e236fe3d745a1c20b1526810651ddcbd2e2de70d9943ae8246ecb6631c8fc6 not found: ID does not exist" Apr 24 21:31:41.375308 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.375295 2575 scope.go:117] "RemoveContainer" containerID="e832440214bdc41550049a3f1e0bc4862ae6a33cfcbda70025dde26746e78c2d" Apr 24 21:31:41.375551 ip-10-0-136-160 
kubenswrapper[2575]: I0424 21:31:41.375527 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e832440214bdc41550049a3f1e0bc4862ae6a33cfcbda70025dde26746e78c2d"} err="failed to get container status \"e832440214bdc41550049a3f1e0bc4862ae6a33cfcbda70025dde26746e78c2d\": rpc error: code = NotFound desc = could not find container \"e832440214bdc41550049a3f1e0bc4862ae6a33cfcbda70025dde26746e78c2d\": container with ID starting with e832440214bdc41550049a3f1e0bc4862ae6a33cfcbda70025dde26746e78c2d not found: ID does not exist" Apr 24 21:31:41.375602 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.375552 2575 scope.go:117] "RemoveContainer" containerID="6ba2a8c26359009d6aaf6bf4a012d069186b62480c7b7f4763bb1612f3d55c5a" Apr 24 21:31:41.375783 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.375767 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ba2a8c26359009d6aaf6bf4a012d069186b62480c7b7f4763bb1612f3d55c5a"} err="failed to get container status \"6ba2a8c26359009d6aaf6bf4a012d069186b62480c7b7f4763bb1612f3d55c5a\": rpc error: code = NotFound desc = could not find container \"6ba2a8c26359009d6aaf6bf4a012d069186b62480c7b7f4763bb1612f3d55c5a\": container with ID starting with 6ba2a8c26359009d6aaf6bf4a012d069186b62480c7b7f4763bb1612f3d55c5a not found: ID does not exist" Apr 24 21:31:41.375835 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.375782 2575 scope.go:117] "RemoveContainer" containerID="5cc33aefc59114416abbb2398eed783b6455aa70a7477d3029aab3c7b4dd44ba" Apr 24 21:31:41.375991 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.375975 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cc33aefc59114416abbb2398eed783b6455aa70a7477d3029aab3c7b4dd44ba"} err="failed to get container status \"5cc33aefc59114416abbb2398eed783b6455aa70a7477d3029aab3c7b4dd44ba\": rpc error: code = NotFound desc = could not find container 
\"5cc33aefc59114416abbb2398eed783b6455aa70a7477d3029aab3c7b4dd44ba\": container with ID starting with 5cc33aefc59114416abbb2398eed783b6455aa70a7477d3029aab3c7b4dd44ba not found: ID does not exist" Apr 24 21:31:41.375991 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.375989 2575 scope.go:117] "RemoveContainer" containerID="9b685d499164ca3a7c86f7b27e2b469d221e5ec3ff1a8cd36d65c7acfa063c5a" Apr 24 21:31:41.376213 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.376194 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b685d499164ca3a7c86f7b27e2b469d221e5ec3ff1a8cd36d65c7acfa063c5a"} err="failed to get container status \"9b685d499164ca3a7c86f7b27e2b469d221e5ec3ff1a8cd36d65c7acfa063c5a\": rpc error: code = NotFound desc = could not find container \"9b685d499164ca3a7c86f7b27e2b469d221e5ec3ff1a8cd36d65c7acfa063c5a\": container with ID starting with 9b685d499164ca3a7c86f7b27e2b469d221e5ec3ff1a8cd36d65c7acfa063c5a not found: ID does not exist" Apr 24 21:31:41.376275 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.376214 2575 scope.go:117] "RemoveContainer" containerID="37a9270cf374759aa76ad1245a5461b5784fd91cbf3e31be35e95c0d17404ef3" Apr 24 21:31:41.376460 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.376440 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a9270cf374759aa76ad1245a5461b5784fd91cbf3e31be35e95c0d17404ef3"} err="failed to get container status \"37a9270cf374759aa76ad1245a5461b5784fd91cbf3e31be35e95c0d17404ef3\": rpc error: code = NotFound desc = could not find container \"37a9270cf374759aa76ad1245a5461b5784fd91cbf3e31be35e95c0d17404ef3\": container with ID starting with 37a9270cf374759aa76ad1245a5461b5784fd91cbf3e31be35e95c0d17404ef3 not found: ID does not exist" Apr 24 21:31:41.376515 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.376460 2575 scope.go:117] "RemoveContainer" 
containerID="e16585af8d23b109352086eca199531df54869f7d04b261039e3731c3c81fc33" Apr 24 21:31:41.376692 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.376673 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e16585af8d23b109352086eca199531df54869f7d04b261039e3731c3c81fc33"} err="failed to get container status \"e16585af8d23b109352086eca199531df54869f7d04b261039e3731c3c81fc33\": rpc error: code = NotFound desc = could not find container \"e16585af8d23b109352086eca199531df54869f7d04b261039e3731c3c81fc33\": container with ID starting with e16585af8d23b109352086eca199531df54869f7d04b261039e3731c3c81fc33 not found: ID does not exist" Apr 24 21:31:41.376732 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.376693 2575 scope.go:117] "RemoveContainer" containerID="f4e236fe3d745a1c20b1526810651ddcbd2e2de70d9943ae8246ecb6631c8fc6" Apr 24 21:31:41.376924 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.376904 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e236fe3d745a1c20b1526810651ddcbd2e2de70d9943ae8246ecb6631c8fc6"} err="failed to get container status \"f4e236fe3d745a1c20b1526810651ddcbd2e2de70d9943ae8246ecb6631c8fc6\": rpc error: code = NotFound desc = could not find container \"f4e236fe3d745a1c20b1526810651ddcbd2e2de70d9943ae8246ecb6631c8fc6\": container with ID starting with f4e236fe3d745a1c20b1526810651ddcbd2e2de70d9943ae8246ecb6631c8fc6 not found: ID does not exist" Apr 24 21:31:41.391492 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.391465 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:31:41.391782 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.391765 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="kube-rbac-proxy-metric" Apr 24 21:31:41.391854 ip-10-0-136-160 kubenswrapper[2575]: I0424 
21:31:41.391783 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="kube-rbac-proxy-metric" Apr 24 21:31:41.391854 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.391806 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="init-config-reloader" Apr 24 21:31:41.391854 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.391815 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="init-config-reloader" Apr 24 21:31:41.391854 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.391825 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="kube-rbac-proxy" Apr 24 21:31:41.391854 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.391854 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="kube-rbac-proxy" Apr 24 21:31:41.392089 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.391864 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="prom-label-proxy" Apr 24 21:31:41.392089 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.391872 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="prom-label-proxy" Apr 24 21:31:41.392089 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.391888 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ef3157f-b15f-49ba-95e9-b598c0c61ccd" containerName="console" Apr 24 21:31:41.392089 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.391897 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef3157f-b15f-49ba-95e9-b598c0c61ccd" containerName="console" Apr 24 21:31:41.392089 ip-10-0-136-160 
kubenswrapper[2575]: I0424 21:31:41.391908 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="kube-rbac-proxy-web" Apr 24 21:31:41.392089 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.391916 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="kube-rbac-proxy-web" Apr 24 21:31:41.392089 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.391933 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="config-reloader" Apr 24 21:31:41.392089 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.391941 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="config-reloader" Apr 24 21:31:41.392089 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.391952 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56b53f30-0bdb-4b79-9eed-e5697be23a61" containerName="console" Apr 24 21:31:41.392089 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.391960 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="56b53f30-0bdb-4b79-9eed-e5697be23a61" containerName="console" Apr 24 21:31:41.392089 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.391971 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="alertmanager" Apr 24 21:31:41.392089 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.391979 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="alertmanager" Apr 24 21:31:41.392089 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.391989 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a" containerName="registry" Apr 24 21:31:41.392089 
ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.391997 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a" containerName="registry" Apr 24 21:31:41.392089 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.392084 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5eb8c399-f9a0-4de2-bd75-f1bbf42ac96a" containerName="registry" Apr 24 21:31:41.392089 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.392096 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="alertmanager" Apr 24 21:31:41.392860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.392105 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="prom-label-proxy" Apr 24 21:31:41.392860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.392115 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ef3157f-b15f-49ba-95e9-b598c0c61ccd" containerName="console" Apr 24 21:31:41.392860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.392125 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="kube-rbac-proxy" Apr 24 21:31:41.392860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.392134 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="kube-rbac-proxy-metric" Apr 24 21:31:41.392860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.392145 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="56b53f30-0bdb-4b79-9eed-e5697be23a61" containerName="console" Apr 24 21:31:41.392860 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.392153 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="config-reloader" Apr 24 21:31:41.392860 ip-10-0-136-160 kubenswrapper[2575]: 
I0424 21:31:41.392162 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" containerName="kube-rbac-proxy-web" Apr 24 21:31:41.398372 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.398350 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.400778 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.400755 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 21:31:41.400853 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.400755 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 21:31:41.401449 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.401416 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 21:31:41.404077 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.404056 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 21:31:41.404849 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.404635 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 21:31:41.404849 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.404730 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 21:31:41.404849 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.404751 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 21:31:41.405010 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.404880 
2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-z79q4\"" Apr 24 21:31:41.405690 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.405286 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 21:31:41.413777 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.413759 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 21:31:41.415338 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.415314 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:31:41.493267 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.493243 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-web-config\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.493360 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.493276 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.493360 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.493304 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.493360 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.493328 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6zkc\" (UniqueName: \"kubernetes.io/projected/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-kube-api-access-q6zkc\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.493515 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.493375 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.493515 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.493442 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.493515 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.493488 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.493650 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.493578 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.493650 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.493618 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.493650 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.493642 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.493792 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.493674 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.493792 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.493702 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-config-out\") pod \"alertmanager-main-0\" (UID: 
\"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.493792 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.493727 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-config-volume\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.594524 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.594473 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.594524 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.594500 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.594524 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.594520 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.594701 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.594539 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-config-out\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.594701 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.594556 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-config-volume\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.594701 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.594575 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-web-config\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.594701 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.594599 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.594701 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.594623 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.594701 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.594650 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-q6zkc\" (UniqueName: \"kubernetes.io/projected/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-kube-api-access-q6zkc\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.594701 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.594687 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.594986 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.594726 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.594986 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.594754 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.594986 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.594800 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.594986 ip-10-0-136-160 
kubenswrapper[2575]: I0424 21:31:41.594891 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.595477 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.595387 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.596004 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.595978 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.598112 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.597878 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-web-config\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.598112 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.597992 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-config-out\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.598112 ip-10-0-136-160 
kubenswrapper[2575]: I0424 21:31:41.598054 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.598112 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.598100 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.598484 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.598460 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.598484 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.598475 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.598562 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.598506 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" 
(UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.598562 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.598524 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-config-volume\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.599259 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.599241 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.603661 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.603643 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6zkc\" (UniqueName: \"kubernetes.io/projected/b1dfa7d7-f1c8-4219-b234-5d14e7096ce6-kube-api-access-q6zkc\") pod \"alertmanager-main-0\" (UID: \"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.709390 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.709365 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:41.835685 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:41.835655 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:31:41.838318 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:31:41.838296 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1dfa7d7_f1c8_4219_b234_5d14e7096ce6.slice/crio-687c6ed797cc32e8982d0b425bd53e9e31e23cf84020d03de7db1faddb9d309a WatchSource:0}: Error finding container 687c6ed797cc32e8982d0b425bd53e9e31e23cf84020d03de7db1faddb9d309a: Status 404 returned error can't find the container with id 687c6ed797cc32e8982d0b425bd53e9e31e23cf84020d03de7db1faddb9d309a Apr 24 21:31:42.330005 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:42.329973 2575 generic.go:358] "Generic (PLEG): container finished" podID="b1dfa7d7-f1c8-4219-b234-5d14e7096ce6" containerID="e8367150624ef76cfab10d6648788a6fa367ab99114a380b5da1b458d10fc660" exitCode=0 Apr 24 21:31:42.330158 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:42.330048 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6","Type":"ContainerDied","Data":"e8367150624ef76cfab10d6648788a6fa367ab99114a380b5da1b458d10fc660"} Apr 24 21:31:42.330158 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:42.330071 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6","Type":"ContainerStarted","Data":"687c6ed797cc32e8982d0b425bd53e9e31e23cf84020d03de7db1faddb9d309a"} Apr 24 21:31:42.583098 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:42.583037 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adbf7f0b-8326-4a75-8f71-049bc66e101f" 
path="/var/lib/kubelet/pods/adbf7f0b-8326-4a75-8f71-049bc66e101f/volumes" Apr 24 21:31:43.337044 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.337001 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6","Type":"ContainerStarted","Data":"95dc32dfbc26d994853b62f45b7944dbdc00287a9ebabaa2861c0707fdacc638"} Apr 24 21:31:43.337439 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.337057 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6","Type":"ContainerStarted","Data":"af256403177410104ab7f09b4f83d06fed25051d73d037108929cef74b7de639"} Apr 24 21:31:43.337439 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.337072 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6","Type":"ContainerStarted","Data":"96273a0f07ec7118dc3e47ef30f550a5e2ea589c7e89bbe442dbca7c3e6f0dfc"} Apr 24 21:31:43.337439 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.337086 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6","Type":"ContainerStarted","Data":"69400678e788c277b05d3a9d748c6fea2237292b358bc73695e43bb4e38efe14"} Apr 24 21:31:43.337439 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.337101 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6","Type":"ContainerStarted","Data":"ac0a349b55e47331fa2082be470e299e9a3992ef1bed246e191de2d66b3f65b3"} Apr 24 21:31:43.337439 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.337114 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"b1dfa7d7-f1c8-4219-b234-5d14e7096ce6","Type":"ContainerStarted","Data":"53a75ea6250ed7cb1ebf153ef0a08860e3e0a17e8095581ec27885e7e356e86b"} Apr 24 21:31:43.361514 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.361486 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5757f86cf4-s8bwr"] Apr 24 21:31:43.364741 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.364722 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.367240 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.367209 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 21:31:43.367349 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.367275 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 21:31:43.367414 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.367358 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-kvstk\"" Apr 24 21:31:43.367743 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.367722 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 21:31:43.367836 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.367753 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 21:31:43.368182 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.368162 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 21:31:43.368261 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.368167 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 21:31:43.368669 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.368649 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 21:31:43.373437 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.373396 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 21:31:43.376990 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.376968 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5757f86cf4-s8bwr"] Apr 24 21:31:43.390250 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.390171 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.390159096 podStartE2EDuration="2.390159096s" podCreationTimestamp="2026-04-24 21:31:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:31:43.388595898 +0000 UTC m=+245.328441631" watchObservedRunningTime="2026-04-24 21:31:43.390159096 +0000 UTC m=+245.330004829" Apr 24 21:31:43.510274 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.510249 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-console-config\") pod \"console-5757f86cf4-s8bwr\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.510405 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.510294 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-service-ca\") pod \"console-5757f86cf4-s8bwr\" (UID: 
\"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.510405 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.510315 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfww4\" (UniqueName: \"kubernetes.io/projected/4b687bba-4000-4cf4-abd1-c3b4696936ad-kube-api-access-dfww4\") pod \"console-5757f86cf4-s8bwr\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.510405 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.510347 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-trusted-ca-bundle\") pod \"console-5757f86cf4-s8bwr\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.510592 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.510440 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-oauth-serving-cert\") pod \"console-5757f86cf4-s8bwr\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.510592 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.510566 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4b687bba-4000-4cf4-abd1-c3b4696936ad-console-oauth-config\") pod \"console-5757f86cf4-s8bwr\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.510769 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.510746 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b687bba-4000-4cf4-abd1-c3b4696936ad-console-serving-cert\") pod \"console-5757f86cf4-s8bwr\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.611497 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.611441 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4b687bba-4000-4cf4-abd1-c3b4696936ad-console-oauth-config\") pod \"console-5757f86cf4-s8bwr\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.611497 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.611493 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b687bba-4000-4cf4-abd1-c3b4696936ad-console-serving-cert\") pod \"console-5757f86cf4-s8bwr\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.611639 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.611511 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-console-config\") pod \"console-5757f86cf4-s8bwr\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.611639 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.611625 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-service-ca\") pod \"console-5757f86cf4-s8bwr\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.611738 
ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.611670 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfww4\" (UniqueName: \"kubernetes.io/projected/4b687bba-4000-4cf4-abd1-c3b4696936ad-kube-api-access-dfww4\") pod \"console-5757f86cf4-s8bwr\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.611738 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.611699 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-trusted-ca-bundle\") pod \"console-5757f86cf4-s8bwr\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.611738 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.611725 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-oauth-serving-cert\") pod \"console-5757f86cf4-s8bwr\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.612204 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.612175 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-console-config\") pod \"console-5757f86cf4-s8bwr\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.612311 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.612238 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-service-ca\") pod \"console-5757f86cf4-s8bwr\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " 
pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.612467 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.612443 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-oauth-serving-cert\") pod \"console-5757f86cf4-s8bwr\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.612595 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.612577 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-trusted-ca-bundle\") pod \"console-5757f86cf4-s8bwr\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.614120 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.614096 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4b687bba-4000-4cf4-abd1-c3b4696936ad-console-oauth-config\") pod \"console-5757f86cf4-s8bwr\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.614199 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.614168 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b687bba-4000-4cf4-abd1-c3b4696936ad-console-serving-cert\") pod \"console-5757f86cf4-s8bwr\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.619551 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.619527 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfww4\" (UniqueName: \"kubernetes.io/projected/4b687bba-4000-4cf4-abd1-c3b4696936ad-kube-api-access-dfww4\") pod \"console-5757f86cf4-s8bwr\" 
(UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.676249 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.676227 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:43.807856 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:43.807806 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5757f86cf4-s8bwr"] Apr 24 21:31:43.807856 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:31:43.807841 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b687bba_4000_4cf4_abd1_c3b4696936ad.slice/crio-dda2442d518806b1ea924a0913c28355b8538f8479157488c4d8f00899571222 WatchSource:0}: Error finding container dda2442d518806b1ea924a0913c28355b8538f8479157488c4d8f00899571222: Status 404 returned error can't find the container with id dda2442d518806b1ea924a0913c28355b8538f8479157488c4d8f00899571222 Apr 24 21:31:44.341899 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:44.341855 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5757f86cf4-s8bwr" event={"ID":"4b687bba-4000-4cf4-abd1-c3b4696936ad","Type":"ContainerStarted","Data":"d949184fe3841b8c8f8d797801d71d52b72aeba07ac7891969ac47ecdb4d5330"} Apr 24 21:31:44.341899 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:44.341901 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5757f86cf4-s8bwr" event={"ID":"4b687bba-4000-4cf4-abd1-c3b4696936ad","Type":"ContainerStarted","Data":"dda2442d518806b1ea924a0913c28355b8538f8479157488c4d8f00899571222"} Apr 24 21:31:44.363708 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:44.363665 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5757f86cf4-s8bwr" podStartSLOduration=1.363650815 podStartE2EDuration="1.363650815s" 
podCreationTimestamp="2026-04-24 21:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:31:44.362212292 +0000 UTC m=+246.302058025" watchObservedRunningTime="2026-04-24 21:31:44.363650815 +0000 UTC m=+246.303496546" Apr 24 21:31:50.259689 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:50.259600 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs\") pod \"network-metrics-daemon-l78bh\" (UID: \"6ec8ce7f-d73f-4ff5-a981-9d84448a51a6\") " pod="openshift-multus/network-metrics-daemon-l78bh" Apr 24 21:31:50.262109 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:50.262081 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ec8ce7f-d73f-4ff5-a981-9d84448a51a6-metrics-certs\") pod \"network-metrics-daemon-l78bh\" (UID: \"6ec8ce7f-d73f-4ff5-a981-9d84448a51a6\") " pod="openshift-multus/network-metrics-daemon-l78bh" Apr 24 21:31:50.481540 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:50.481507 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mgr5d\"" Apr 24 21:31:50.489672 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:50.489652 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l78bh" Apr 24 21:31:50.609981 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:50.609963 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l78bh"] Apr 24 21:31:50.612053 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:31:50.612027 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ec8ce7f_d73f_4ff5_a981_9d84448a51a6.slice/crio-24e603bf7e453da0a2cb6b8bbf607d27ef840c9f52c0b351de53b06c6db2619b WatchSource:0}: Error finding container 24e603bf7e453da0a2cb6b8bbf607d27ef840c9f52c0b351de53b06c6db2619b: Status 404 returned error can't find the container with id 24e603bf7e453da0a2cb6b8bbf607d27ef840c9f52c0b351de53b06c6db2619b Apr 24 21:31:51.364361 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:51.364327 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l78bh" event={"ID":"6ec8ce7f-d73f-4ff5-a981-9d84448a51a6","Type":"ContainerStarted","Data":"24e603bf7e453da0a2cb6b8bbf607d27ef840c9f52c0b351de53b06c6db2619b"} Apr 24 21:31:52.368529 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:52.368497 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l78bh" event={"ID":"6ec8ce7f-d73f-4ff5-a981-9d84448a51a6","Type":"ContainerStarted","Data":"c2c42b8d394c5889b2f1bf6233c12d51787c6012d04dbe4f3ccf5db9f50e0720"} Apr 24 21:31:52.368529 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:52.368535 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l78bh" event={"ID":"6ec8ce7f-d73f-4ff5-a981-9d84448a51a6","Type":"ContainerStarted","Data":"8e764d2d129d3c4c549cc02eed6e2e941a074362b0ca9add6d268fe55211aedf"} Apr 24 21:31:52.390386 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:52.390342 2575 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-l78bh" podStartSLOduration=253.450857772 podStartE2EDuration="4m14.390327233s" podCreationTimestamp="2026-04-24 21:27:38 +0000 UTC" firstStartedPulling="2026-04-24 21:31:50.613965787 +0000 UTC m=+252.553811501" lastFinishedPulling="2026-04-24 21:31:51.553435251 +0000 UTC m=+253.493280962" observedRunningTime="2026-04-24 21:31:52.390300222 +0000 UTC m=+254.330145954" watchObservedRunningTime="2026-04-24 21:31:52.390327233 +0000 UTC m=+254.330172965" Apr 24 21:31:53.677217 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:53.677185 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:53.677698 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:53.677450 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:53.682168 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:53.682144 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:31:54.379141 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:31:54.379111 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:32:16.086099 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:16.086058 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-f645d"] Apr 24 21:32:16.090710 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:16.090685 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-f645d" Apr 24 21:32:16.101385 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:16.101366 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:32:16.108362 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:16.108338 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-f645d"] Apr 24 21:32:16.130839 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:16.130813 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f92f23e3-4a12-4a9e-b80d-b1da1f6662c0-original-pull-secret\") pod \"global-pull-secret-syncer-f645d\" (UID: \"f92f23e3-4a12-4a9e-b80d-b1da1f6662c0\") " pod="kube-system/global-pull-secret-syncer-f645d" Apr 24 21:32:16.130956 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:16.130864 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f92f23e3-4a12-4a9e-b80d-b1da1f6662c0-dbus\") pod \"global-pull-secret-syncer-f645d\" (UID: \"f92f23e3-4a12-4a9e-b80d-b1da1f6662c0\") " pod="kube-system/global-pull-secret-syncer-f645d" Apr 24 21:32:16.131019 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:16.130952 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f92f23e3-4a12-4a9e-b80d-b1da1f6662c0-kubelet-config\") pod \"global-pull-secret-syncer-f645d\" (UID: \"f92f23e3-4a12-4a9e-b80d-b1da1f6662c0\") " pod="kube-system/global-pull-secret-syncer-f645d" Apr 24 21:32:16.231567 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:16.231532 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/f92f23e3-4a12-4a9e-b80d-b1da1f6662c0-dbus\") pod \"global-pull-secret-syncer-f645d\" (UID: \"f92f23e3-4a12-4a9e-b80d-b1da1f6662c0\") " pod="kube-system/global-pull-secret-syncer-f645d"
Apr 24 21:32:16.231666 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:16.231585 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f92f23e3-4a12-4a9e-b80d-b1da1f6662c0-kubelet-config\") pod \"global-pull-secret-syncer-f645d\" (UID: \"f92f23e3-4a12-4a9e-b80d-b1da1f6662c0\") " pod="kube-system/global-pull-secret-syncer-f645d"
Apr 24 21:32:16.231666 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:16.231626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f92f23e3-4a12-4a9e-b80d-b1da1f6662c0-original-pull-secret\") pod \"global-pull-secret-syncer-f645d\" (UID: \"f92f23e3-4a12-4a9e-b80d-b1da1f6662c0\") " pod="kube-system/global-pull-secret-syncer-f645d"
Apr 24 21:32:16.231744 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:16.231717 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f92f23e3-4a12-4a9e-b80d-b1da1f6662c0-kubelet-config\") pod \"global-pull-secret-syncer-f645d\" (UID: \"f92f23e3-4a12-4a9e-b80d-b1da1f6662c0\") " pod="kube-system/global-pull-secret-syncer-f645d"
Apr 24 21:32:16.231790 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:16.231742 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f92f23e3-4a12-4a9e-b80d-b1da1f6662c0-dbus\") pod \"global-pull-secret-syncer-f645d\" (UID: \"f92f23e3-4a12-4a9e-b80d-b1da1f6662c0\") " pod="kube-system/global-pull-secret-syncer-f645d"
Apr 24 21:32:16.233979 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:16.233961 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f92f23e3-4a12-4a9e-b80d-b1da1f6662c0-original-pull-secret\") pod \"global-pull-secret-syncer-f645d\" (UID: \"f92f23e3-4a12-4a9e-b80d-b1da1f6662c0\") " pod="kube-system/global-pull-secret-syncer-f645d"
Apr 24 21:32:16.399890 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:16.399827 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f645d"
Apr 24 21:32:16.520911 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:16.520884 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-f645d"]
Apr 24 21:32:16.523022 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:32:16.522994 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf92f23e3_4a12_4a9e_b80d_b1da1f6662c0.slice/crio-fcc89b85b0c5de07c56b606251151d3cae99742fd2ba5bf82e036ef7bf480803 WatchSource:0}: Error finding container fcc89b85b0c5de07c56b606251151d3cae99742fd2ba5bf82e036ef7bf480803: Status 404 returned error can't find the container with id fcc89b85b0c5de07c56b606251151d3cae99742fd2ba5bf82e036ef7bf480803
Apr 24 21:32:17.451154 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:17.451115 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-f645d" event={"ID":"f92f23e3-4a12-4a9e-b80d-b1da1f6662c0","Type":"ContainerStarted","Data":"fcc89b85b0c5de07c56b606251151d3cae99742fd2ba5bf82e036ef7bf480803"}
Apr 24 21:32:20.461891 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:20.461849 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-f645d" event={"ID":"f92f23e3-4a12-4a9e-b80d-b1da1f6662c0","Type":"ContainerStarted","Data":"fdfaf3dcda1a58cc579e1f8f9ab94ffd54cebe38a95f4fa3f8e15d586166b0c8"}
Apr 24 21:32:20.495604 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:20.495552 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-f645d" podStartSLOduration=1.184817563 podStartE2EDuration="4.495537424s" podCreationTimestamp="2026-04-24 21:32:16 +0000 UTC" firstStartedPulling="2026-04-24 21:32:16.524773781 +0000 UTC m=+278.464619493" lastFinishedPulling="2026-04-24 21:32:19.835493642 +0000 UTC m=+281.775339354" observedRunningTime="2026-04-24 21:32:20.494821492 +0000 UTC m=+282.434667225" watchObservedRunningTime="2026-04-24 21:32:20.495537424 +0000 UTC m=+282.435383156"
Apr 24 21:32:33.648164 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:33.648122 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh"]
Apr 24 21:32:33.652676 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:33.652651 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh"
Apr 24 21:32:33.655457 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:33.655434 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 21:32:33.655552 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:33.655433 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 21:32:33.655597 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:33.655579 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-p7q8s\""
Apr 24 21:32:33.660679 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:33.660655 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh"]
Apr 24 21:32:33.755496 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:33.755473 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25h7f\" (UniqueName: \"kubernetes.io/projected/a820ef6e-8ae5-4499-921f-60fc665fb5e8-kube-api-access-25h7f\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh\" (UID: \"a820ef6e-8ae5-4499-921f-60fc665fb5e8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh"
Apr 24 21:32:33.755595 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:33.755521 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a820ef6e-8ae5-4499-921f-60fc665fb5e8-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh\" (UID: \"a820ef6e-8ae5-4499-921f-60fc665fb5e8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh"
Apr 24 21:32:33.755634 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:33.755601 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a820ef6e-8ae5-4499-921f-60fc665fb5e8-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh\" (UID: \"a820ef6e-8ae5-4499-921f-60fc665fb5e8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh"
Apr 24 21:32:33.856414 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:33.856393 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a820ef6e-8ae5-4499-921f-60fc665fb5e8-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh\" (UID: \"a820ef6e-8ae5-4499-921f-60fc665fb5e8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh"
Apr 24 21:32:33.856512 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:33.856447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a820ef6e-8ae5-4499-921f-60fc665fb5e8-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh\" (UID: \"a820ef6e-8ae5-4499-921f-60fc665fb5e8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh"
Apr 24 21:32:33.856512 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:33.856487 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25h7f\" (UniqueName: \"kubernetes.io/projected/a820ef6e-8ae5-4499-921f-60fc665fb5e8-kube-api-access-25h7f\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh\" (UID: \"a820ef6e-8ae5-4499-921f-60fc665fb5e8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh"
Apr 24 21:32:33.856848 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:33.856826 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a820ef6e-8ae5-4499-921f-60fc665fb5e8-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh\" (UID: \"a820ef6e-8ae5-4499-921f-60fc665fb5e8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh"
Apr 24 21:32:33.856848 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:33.856839 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a820ef6e-8ae5-4499-921f-60fc665fb5e8-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh\" (UID: \"a820ef6e-8ae5-4499-921f-60fc665fb5e8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh"
Apr 24 21:32:33.865511 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:33.865485 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-25h7f\" (UniqueName: \"kubernetes.io/projected/a820ef6e-8ae5-4499-921f-60fc665fb5e8-kube-api-access-25h7f\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh\" (UID: \"a820ef6e-8ae5-4499-921f-60fc665fb5e8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh"
Apr 24 21:32:33.963458 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:33.963409 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh"
Apr 24 21:32:34.085817 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:34.085794 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh"]
Apr 24 21:32:34.087822 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:32:34.087795 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda820ef6e_8ae5_4499_921f_60fc665fb5e8.slice/crio-a432030d857217cc8f68e4c096919858b1d703460b5b01c45921045aeee53e7f WatchSource:0}: Error finding container a432030d857217cc8f68e4c096919858b1d703460b5b01c45921045aeee53e7f: Status 404 returned error can't find the container with id a432030d857217cc8f68e4c096919858b1d703460b5b01c45921045aeee53e7f
Apr 24 21:32:34.504145 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:34.504113 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh" event={"ID":"a820ef6e-8ae5-4499-921f-60fc665fb5e8","Type":"ContainerStarted","Data":"a432030d857217cc8f68e4c096919858b1d703460b5b01c45921045aeee53e7f"}
Apr 24 21:32:38.734561 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:38.734497 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log"
Apr 24 21:32:38.735619 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:38.735597 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log"
Apr 24 21:32:38.741600 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:38.741583 2575 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 21:32:39.521741 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:39.521706 2575 generic.go:358] "Generic (PLEG): container finished" podID="a820ef6e-8ae5-4499-921f-60fc665fb5e8" containerID="b32659b0d3b964b5ca3422726ffe96bee915abd278e54694dda84d612fd23731" exitCode=0
Apr 24 21:32:39.521886 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:39.521759 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh" event={"ID":"a820ef6e-8ae5-4499-921f-60fc665fb5e8","Type":"ContainerDied","Data":"b32659b0d3b964b5ca3422726ffe96bee915abd278e54694dda84d612fd23731"}
Apr 24 21:32:39.522692 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:39.522676 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:32:41.529784 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:41.529745 2575 generic.go:358] "Generic (PLEG): container finished" podID="a820ef6e-8ae5-4499-921f-60fc665fb5e8" containerID="cb757fdfd39c4a84a0489c15d10cf01bc16b59102f13ab0a8c807f7f95306bed" exitCode=0
Apr 24 21:32:41.530170 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:41.529799 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh" event={"ID":"a820ef6e-8ae5-4499-921f-60fc665fb5e8","Type":"ContainerDied","Data":"cb757fdfd39c4a84a0489c15d10cf01bc16b59102f13ab0a8c807f7f95306bed"}
Apr 24 21:32:47.550659 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:47.550613 2575 generic.go:358] "Generic (PLEG): container finished" podID="a820ef6e-8ae5-4499-921f-60fc665fb5e8" containerID="d5dbf7a9450ebd0224f2500b22ac9067679962ed7078d0fb7f60c2420880d005" exitCode=0
Apr 24 21:32:47.550997 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:47.550752 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh" event={"ID":"a820ef6e-8ae5-4499-921f-60fc665fb5e8","Type":"ContainerDied","Data":"d5dbf7a9450ebd0224f2500b22ac9067679962ed7078d0fb7f60c2420880d005"}
Apr 24 21:32:48.678799 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:48.678777 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh"
Apr 24 21:32:48.771293 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:48.771271 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25h7f\" (UniqueName: \"kubernetes.io/projected/a820ef6e-8ae5-4499-921f-60fc665fb5e8-kube-api-access-25h7f\") pod \"a820ef6e-8ae5-4499-921f-60fc665fb5e8\" (UID: \"a820ef6e-8ae5-4499-921f-60fc665fb5e8\") "
Apr 24 21:32:48.771404 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:48.771302 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a820ef6e-8ae5-4499-921f-60fc665fb5e8-bundle\") pod \"a820ef6e-8ae5-4499-921f-60fc665fb5e8\" (UID: \"a820ef6e-8ae5-4499-921f-60fc665fb5e8\") "
Apr 24 21:32:48.771404 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:48.771327 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a820ef6e-8ae5-4499-921f-60fc665fb5e8-util\") pod \"a820ef6e-8ae5-4499-921f-60fc665fb5e8\" (UID: \"a820ef6e-8ae5-4499-921f-60fc665fb5e8\") "
Apr 24 21:32:48.771790 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:48.771768 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a820ef6e-8ae5-4499-921f-60fc665fb5e8-bundle" (OuterVolumeSpecName: "bundle") pod "a820ef6e-8ae5-4499-921f-60fc665fb5e8" (UID: "a820ef6e-8ae5-4499-921f-60fc665fb5e8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:32:48.773432 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:48.773399 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a820ef6e-8ae5-4499-921f-60fc665fb5e8-kube-api-access-25h7f" (OuterVolumeSpecName: "kube-api-access-25h7f") pod "a820ef6e-8ae5-4499-921f-60fc665fb5e8" (UID: "a820ef6e-8ae5-4499-921f-60fc665fb5e8"). InnerVolumeSpecName "kube-api-access-25h7f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:32:48.775183 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:48.775163 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a820ef6e-8ae5-4499-921f-60fc665fb5e8-util" (OuterVolumeSpecName: "util") pod "a820ef6e-8ae5-4499-921f-60fc665fb5e8" (UID: "a820ef6e-8ae5-4499-921f-60fc665fb5e8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:32:48.871977 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:48.871930 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-25h7f\" (UniqueName: \"kubernetes.io/projected/a820ef6e-8ae5-4499-921f-60fc665fb5e8-kube-api-access-25h7f\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\""
Apr 24 21:32:48.871977 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:48.871949 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a820ef6e-8ae5-4499-921f-60fc665fb5e8-bundle\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\""
Apr 24 21:32:48.871977 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:48.871959 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a820ef6e-8ae5-4499-921f-60fc665fb5e8-util\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\""
Apr 24 21:32:49.559073 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:49.559034 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh" event={"ID":"a820ef6e-8ae5-4499-921f-60fc665fb5e8","Type":"ContainerDied","Data":"a432030d857217cc8f68e4c096919858b1d703460b5b01c45921045aeee53e7f"}
Apr 24 21:32:49.559073 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:49.559069 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a432030d857217cc8f68e4c096919858b1d703460b5b01c45921045aeee53e7f"
Apr 24 21:32:49.559247 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:49.559083 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7m8jh"
Apr 24 21:32:55.718255 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:55.718219 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr"]
Apr 24 21:32:55.718670 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:55.718548 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a820ef6e-8ae5-4499-921f-60fc665fb5e8" containerName="extract"
Apr 24 21:32:55.718670 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:55.718562 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a820ef6e-8ae5-4499-921f-60fc665fb5e8" containerName="extract"
Apr 24 21:32:55.718670 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:55.718577 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a820ef6e-8ae5-4499-921f-60fc665fb5e8" containerName="pull"
Apr 24 21:32:55.718670 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:55.718582 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a820ef6e-8ae5-4499-921f-60fc665fb5e8" containerName="pull"
Apr 24 21:32:55.718670 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:55.718599 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a820ef6e-8ae5-4499-921f-60fc665fb5e8" containerName="util"
Apr 24 21:32:55.718670 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:55.718604 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a820ef6e-8ae5-4499-921f-60fc665fb5e8" containerName="util"
Apr 24 21:32:55.718670 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:55.718649 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a820ef6e-8ae5-4499-921f-60fc665fb5e8" containerName="extract"
Apr 24 21:32:55.766234 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:55.766204 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr"]
Apr 24 21:32:55.766381 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:55.766301 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr"
Apr 24 21:32:55.769055 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:55.769028 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 24 21:32:55.769240 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:55.769034 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 24 21:32:55.769240 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:55.769038 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-zx5dc\""
Apr 24 21:32:55.769240 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:55.769085 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 24 21:32:55.930263 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:55.930232 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8n4z\" (UniqueName: \"kubernetes.io/projected/785243ed-9144-44f0-832b-a188b2080bbb-kube-api-access-r8n4z\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr\" (UID: \"785243ed-9144-44f0-832b-a188b2080bbb\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr"
Apr 24 21:32:55.930385 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:55.930287 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/785243ed-9144-44f0-832b-a188b2080bbb-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr\" (UID: \"785243ed-9144-44f0-832b-a188b2080bbb\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr"
Apr 24 21:32:56.031181 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:56.031108 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8n4z\" (UniqueName: \"kubernetes.io/projected/785243ed-9144-44f0-832b-a188b2080bbb-kube-api-access-r8n4z\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr\" (UID: \"785243ed-9144-44f0-832b-a188b2080bbb\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr"
Apr 24 21:32:56.031181 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:56.031156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/785243ed-9144-44f0-832b-a188b2080bbb-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr\" (UID: \"785243ed-9144-44f0-832b-a188b2080bbb\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr"
Apr 24 21:32:56.033514 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:56.033490 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/785243ed-9144-44f0-832b-a188b2080bbb-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr\" (UID: \"785243ed-9144-44f0-832b-a188b2080bbb\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr"
Apr 24 21:32:56.042655 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:56.042630 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8n4z\" (UniqueName: \"kubernetes.io/projected/785243ed-9144-44f0-832b-a188b2080bbb-kube-api-access-r8n4z\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr\" (UID: \"785243ed-9144-44f0-832b-a188b2080bbb\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr"
Apr 24 21:32:56.076086 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:56.076065 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr"
Apr 24 21:32:56.193623 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:56.193603 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr"]
Apr 24 21:32:56.196379 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:32:56.196352 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod785243ed_9144_44f0_832b_a188b2080bbb.slice/crio-80d26099a0fe68b8cccfb0d7dead7817a9a778b511e5889280f8688f987eeacc WatchSource:0}: Error finding container 80d26099a0fe68b8cccfb0d7dead7817a9a778b511e5889280f8688f987eeacc: Status 404 returned error can't find the container with id 80d26099a0fe68b8cccfb0d7dead7817a9a778b511e5889280f8688f987eeacc
Apr 24 21:32:56.582132 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:32:56.582106 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr" event={"ID":"785243ed-9144-44f0-832b-a188b2080bbb","Type":"ContainerStarted","Data":"80d26099a0fe68b8cccfb0d7dead7817a9a778b511e5889280f8688f987eeacc"}
Apr 24 21:33:00.180547 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.180513 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-d6t4q"]
Apr 24 21:33:00.198710 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.197951 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-d6t4q"]
Apr 24 21:33:00.198710 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.198077 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-d6t4q"
Apr 24 21:33:00.200913 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.200890 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 24 21:33:00.201063 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.201043 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-4xsvc\""
Apr 24 21:33:00.201655 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.201638 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 24 21:33:00.368371 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.368341 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/236113b9-51e0-47b2-88de-f025f8df7a7a-cabundle0\") pod \"keda-operator-ffbb595cb-d6t4q\" (UID: \"236113b9-51e0-47b2-88de-f025f8df7a7a\") " pod="openshift-keda/keda-operator-ffbb595cb-d6t4q"
Apr 24 21:33:00.368527 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.368384 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8chld\" (UniqueName: \"kubernetes.io/projected/236113b9-51e0-47b2-88de-f025f8df7a7a-kube-api-access-8chld\") pod \"keda-operator-ffbb595cb-d6t4q\" (UID: \"236113b9-51e0-47b2-88de-f025f8df7a7a\") " pod="openshift-keda/keda-operator-ffbb595cb-d6t4q"
Apr 24 21:33:00.368527 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.368479 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/236113b9-51e0-47b2-88de-f025f8df7a7a-certificates\") pod \"keda-operator-ffbb595cb-d6t4q\" (UID: \"236113b9-51e0-47b2-88de-f025f8df7a7a\") " pod="openshift-keda/keda-operator-ffbb595cb-d6t4q"
Apr 24 21:33:00.468975 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.468950 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/236113b9-51e0-47b2-88de-f025f8df7a7a-certificates\") pod \"keda-operator-ffbb595cb-d6t4q\" (UID: \"236113b9-51e0-47b2-88de-f025f8df7a7a\") " pod="openshift-keda/keda-operator-ffbb595cb-d6t4q"
Apr 24 21:33:00.469120 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.469007 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/236113b9-51e0-47b2-88de-f025f8df7a7a-cabundle0\") pod \"keda-operator-ffbb595cb-d6t4q\" (UID: \"236113b9-51e0-47b2-88de-f025f8df7a7a\") " pod="openshift-keda/keda-operator-ffbb595cb-d6t4q"
Apr 24 21:33:00.469120 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.469038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8chld\" (UniqueName: \"kubernetes.io/projected/236113b9-51e0-47b2-88de-f025f8df7a7a-kube-api-access-8chld\") pod \"keda-operator-ffbb595cb-d6t4q\" (UID: \"236113b9-51e0-47b2-88de-f025f8df7a7a\") " pod="openshift-keda/keda-operator-ffbb595cb-d6t4q"
Apr 24 21:33:00.469120 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:00.469090 2575 secret.go:281] references non-existent secret key: ca.crt
Apr 24 21:33:00.469120 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:00.469107 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 21:33:00.469120 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:00.469118 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-d6t4q: references non-existent secret key: ca.crt
Apr 24 21:33:00.469382 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:00.469171 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/236113b9-51e0-47b2-88de-f025f8df7a7a-certificates podName:236113b9-51e0-47b2-88de-f025f8df7a7a nodeName:}" failed. No retries permitted until 2026-04-24 21:33:00.969154502 +0000 UTC m=+322.909000213 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/236113b9-51e0-47b2-88de-f025f8df7a7a-certificates") pod "keda-operator-ffbb595cb-d6t4q" (UID: "236113b9-51e0-47b2-88de-f025f8df7a7a") : references non-existent secret key: ca.crt
Apr 24 21:33:00.469696 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.469674 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/236113b9-51e0-47b2-88de-f025f8df7a7a-cabundle0\") pod \"keda-operator-ffbb595cb-d6t4q\" (UID: \"236113b9-51e0-47b2-88de-f025f8df7a7a\") " pod="openshift-keda/keda-operator-ffbb595cb-d6t4q"
Apr 24 21:33:00.481177 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.481154 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l"]
Apr 24 21:33:00.491027 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.491000 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8chld\" (UniqueName: \"kubernetes.io/projected/236113b9-51e0-47b2-88de-f025f8df7a7a-kube-api-access-8chld\") pod \"keda-operator-ffbb595cb-d6t4q\" (UID: \"236113b9-51e0-47b2-88de-f025f8df7a7a\") " pod="openshift-keda/keda-operator-ffbb595cb-d6t4q"
Apr 24 21:33:00.504602 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.504580 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l"]
Apr 24 21:33:00.504714 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.504701 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l"
Apr 24 21:33:00.508650 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.508626 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 24 21:33:00.596843 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.596818 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr" event={"ID":"785243ed-9144-44f0-832b-a188b2080bbb","Type":"ContainerStarted","Data":"d108f31541b6998cb12de17324ca52336fd9d6650204457b9ab93985069f4961"}
Apr 24 21:33:00.596958 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.596864 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr"
Apr 24 21:33:00.630517 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.630224 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr" podStartSLOduration=2.207475498 podStartE2EDuration="5.630209417s" podCreationTimestamp="2026-04-24 21:32:55 +0000 UTC" firstStartedPulling="2026-04-24 21:32:56.198022216 +0000 UTC m=+318.137867928" lastFinishedPulling="2026-04-24 21:32:59.620756135 +0000 UTC m=+321.560601847" observedRunningTime="2026-04-24 21:33:00.625479734 +0000 UTC m=+322.565325466" watchObservedRunningTime="2026-04-24 21:33:00.630209417 +0000 UTC m=+322.570055150"
Apr 24 21:33:00.671342 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.671315 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5d813ff8-fb66-4a3c-807b-74c4ca04e040-certificates\") pod \"keda-metrics-apiserver-7c9f485588-gv45l\" (UID: \"5d813ff8-fb66-4a3c-807b-74c4ca04e040\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l"
Apr 24 21:33:00.671450 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.671411 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlxwd\" (UniqueName: \"kubernetes.io/projected/5d813ff8-fb66-4a3c-807b-74c4ca04e040-kube-api-access-tlxwd\") pod \"keda-metrics-apiserver-7c9f485588-gv45l\" (UID: \"5d813ff8-fb66-4a3c-807b-74c4ca04e040\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l"
Apr 24 21:33:00.671512 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.671486 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/5d813ff8-fb66-4a3c-807b-74c4ca04e040-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-gv45l\" (UID: \"5d813ff8-fb66-4a3c-807b-74c4ca04e040\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l"
Apr 24 21:33:00.772592 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.772532 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlxwd\" (UniqueName: \"kubernetes.io/projected/5d813ff8-fb66-4a3c-807b-74c4ca04e040-kube-api-access-tlxwd\") pod \"keda-metrics-apiserver-7c9f485588-gv45l\" (UID: \"5d813ff8-fb66-4a3c-807b-74c4ca04e040\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l"
Apr 24 21:33:00.772592 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.772564 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/5d813ff8-fb66-4a3c-807b-74c4ca04e040-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-gv45l\" (UID: \"5d813ff8-fb66-4a3c-807b-74c4ca04e040\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l"
Apr 24 21:33:00.772744 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.772602 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5d813ff8-fb66-4a3c-807b-74c4ca04e040-certificates\") pod \"keda-metrics-apiserver-7c9f485588-gv45l\" (UID: \"5d813ff8-fb66-4a3c-807b-74c4ca04e040\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l"
Apr 24 21:33:00.773049 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.773025 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/5d813ff8-fb66-4a3c-807b-74c4ca04e040-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-gv45l\" (UID: \"5d813ff8-fb66-4a3c-807b-74c4ca04e040\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l"
Apr 24 21:33:00.773108 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:00.773056 2575 secret.go:281] references non-existent secret key: tls.crt
Apr 24 21:33:00.773108 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:00.773071 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 21:33:00.773108 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:00.773085 2575 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found
Apr 24 21:33:00.773108 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:00.773099 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 24 21:33:00.773298 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:00.773147 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d813ff8-fb66-4a3c-807b-74c4ca04e040-certificates podName:5d813ff8-fb66-4a3c-807b-74c4ca04e040 nodeName:}" failed. No retries permitted until 2026-04-24 21:33:01.273130158 +0000 UTC m=+323.212975870 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5d813ff8-fb66-4a3c-807b-74c4ca04e040-certificates") pod "keda-metrics-apiserver-7c9f485588-gv45l" (UID: "5d813ff8-fb66-4a3c-807b-74c4ca04e040") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 24 21:33:00.786722 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.786695 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlxwd\" (UniqueName: \"kubernetes.io/projected/5d813ff8-fb66-4a3c-807b-74c4ca04e040-kube-api-access-tlxwd\") pod \"keda-metrics-apiserver-7c9f485588-gv45l\" (UID: \"5d813ff8-fb66-4a3c-807b-74c4ca04e040\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l"
Apr 24 21:33:00.810695 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.810669 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-kxwxn"]
Apr 24 21:33:00.834872 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.834849 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-kxwxn"]
Apr 24 21:33:00.834977 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.834941 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-kxwxn" Apr 24 21:33:00.838170 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.838149 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 24 21:33:00.974554 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.974525 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/236113b9-51e0-47b2-88de-f025f8df7a7a-certificates\") pod \"keda-operator-ffbb595cb-d6t4q\" (UID: \"236113b9-51e0-47b2-88de-f025f8df7a7a\") " pod="openshift-keda/keda-operator-ffbb595cb-d6t4q" Apr 24 21:33:00.974689 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.974566 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqlmm\" (UniqueName: \"kubernetes.io/projected/d3eae593-ff88-4630-a639-34654f2d5f56-kube-api-access-gqlmm\") pod \"keda-admission-cf49989db-kxwxn\" (UID: \"d3eae593-ff88-4630-a639-34654f2d5f56\") " pod="openshift-keda/keda-admission-cf49989db-kxwxn" Apr 24 21:33:00.974689 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:00.974673 2575 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:33:00.974772 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:00.974691 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:33:00.974772 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:00.974701 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-d6t4q: references non-existent secret key: ca.crt Apr 24 21:33:00.974772 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:00.974740 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/236113b9-51e0-47b2-88de-f025f8df7a7a-certificates 
podName:236113b9-51e0-47b2-88de-f025f8df7a7a nodeName:}" failed. No retries permitted until 2026-04-24 21:33:01.974726535 +0000 UTC m=+323.914572251 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/236113b9-51e0-47b2-88de-f025f8df7a7a-certificates") pod "keda-operator-ffbb595cb-d6t4q" (UID: "236113b9-51e0-47b2-88de-f025f8df7a7a") : references non-existent secret key: ca.crt Apr 24 21:33:00.974772 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:00.974675 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d3eae593-ff88-4630-a639-34654f2d5f56-certificates\") pod \"keda-admission-cf49989db-kxwxn\" (UID: \"d3eae593-ff88-4630-a639-34654f2d5f56\") " pod="openshift-keda/keda-admission-cf49989db-kxwxn" Apr 24 21:33:01.075574 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:01.075513 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d3eae593-ff88-4630-a639-34654f2d5f56-certificates\") pod \"keda-admission-cf49989db-kxwxn\" (UID: \"d3eae593-ff88-4630-a639-34654f2d5f56\") " pod="openshift-keda/keda-admission-cf49989db-kxwxn" Apr 24 21:33:01.075574 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:01.075555 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqlmm\" (UniqueName: \"kubernetes.io/projected/d3eae593-ff88-4630-a639-34654f2d5f56-kube-api-access-gqlmm\") pod \"keda-admission-cf49989db-kxwxn\" (UID: \"d3eae593-ff88-4630-a639-34654f2d5f56\") " pod="openshift-keda/keda-admission-cf49989db-kxwxn" Apr 24 21:33:01.077892 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:01.077875 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d3eae593-ff88-4630-a639-34654f2d5f56-certificates\") pod 
\"keda-admission-cf49989db-kxwxn\" (UID: \"d3eae593-ff88-4630-a639-34654f2d5f56\") " pod="openshift-keda/keda-admission-cf49989db-kxwxn" Apr 24 21:33:01.084852 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:01.084812 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqlmm\" (UniqueName: \"kubernetes.io/projected/d3eae593-ff88-4630-a639-34654f2d5f56-kube-api-access-gqlmm\") pod \"keda-admission-cf49989db-kxwxn\" (UID: \"d3eae593-ff88-4630-a639-34654f2d5f56\") " pod="openshift-keda/keda-admission-cf49989db-kxwxn" Apr 24 21:33:01.147957 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:01.147927 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-kxwxn" Apr 24 21:33:01.264132 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:01.264106 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-kxwxn"] Apr 24 21:33:01.266365 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:33:01.266341 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3eae593_ff88_4630_a639_34654f2d5f56.slice/crio-99f18ef030240fc8c753616a752a0fdecd924aec332124f4497016e502437ac7 WatchSource:0}: Error finding container 99f18ef030240fc8c753616a752a0fdecd924aec332124f4497016e502437ac7: Status 404 returned error can't find the container with id 99f18ef030240fc8c753616a752a0fdecd924aec332124f4497016e502437ac7 Apr 24 21:33:01.277380 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:01.277360 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5d813ff8-fb66-4a3c-807b-74c4ca04e040-certificates\") pod \"keda-metrics-apiserver-7c9f485588-gv45l\" (UID: \"5d813ff8-fb66-4a3c-807b-74c4ca04e040\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l" Apr 24 21:33:01.277524 ip-10-0-136-160 
kubenswrapper[2575]: E0424 21:33:01.277507 2575 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:33:01.277567 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:01.277526 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:33:01.277567 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:01.277543 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l: references non-existent secret key: tls.crt Apr 24 21:33:01.277631 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:01.277587 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d813ff8-fb66-4a3c-807b-74c4ca04e040-certificates podName:5d813ff8-fb66-4a3c-807b-74c4ca04e040 nodeName:}" failed. No retries permitted until 2026-04-24 21:33:02.277573478 +0000 UTC m=+324.217419190 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5d813ff8-fb66-4a3c-807b-74c4ca04e040-certificates") pod "keda-metrics-apiserver-7c9f485588-gv45l" (UID: "5d813ff8-fb66-4a3c-807b-74c4ca04e040") : references non-existent secret key: tls.crt Apr 24 21:33:01.601322 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:01.601287 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-kxwxn" event={"ID":"d3eae593-ff88-4630-a639-34654f2d5f56","Type":"ContainerStarted","Data":"99f18ef030240fc8c753616a752a0fdecd924aec332124f4497016e502437ac7"} Apr 24 21:33:01.984498 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:01.984456 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/236113b9-51e0-47b2-88de-f025f8df7a7a-certificates\") pod \"keda-operator-ffbb595cb-d6t4q\" (UID: \"236113b9-51e0-47b2-88de-f025f8df7a7a\") " 
pod="openshift-keda/keda-operator-ffbb595cb-d6t4q" Apr 24 21:33:01.984659 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:01.984608 2575 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:33:01.984659 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:01.984628 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:33:01.984659 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:01.984639 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-d6t4q: references non-existent secret key: ca.crt Apr 24 21:33:01.984823 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:01.984707 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/236113b9-51e0-47b2-88de-f025f8df7a7a-certificates podName:236113b9-51e0-47b2-88de-f025f8df7a7a nodeName:}" failed. No retries permitted until 2026-04-24 21:33:03.984687224 +0000 UTC m=+325.924532946 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/236113b9-51e0-47b2-88de-f025f8df7a7a-certificates") pod "keda-operator-ffbb595cb-d6t4q" (UID: "236113b9-51e0-47b2-88de-f025f8df7a7a") : references non-existent secret key: ca.crt Apr 24 21:33:02.287183 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:02.287105 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5d813ff8-fb66-4a3c-807b-74c4ca04e040-certificates\") pod \"keda-metrics-apiserver-7c9f485588-gv45l\" (UID: \"5d813ff8-fb66-4a3c-807b-74c4ca04e040\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l" Apr 24 21:33:02.287661 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:02.287244 2575 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:33:02.287661 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:02.287262 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:33:02.287661 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:02.287280 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l: references non-existent secret key: tls.crt Apr 24 21:33:02.287661 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:02.287347 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d813ff8-fb66-4a3c-807b-74c4ca04e040-certificates podName:5d813ff8-fb66-4a3c-807b-74c4ca04e040 nodeName:}" failed. No retries permitted until 2026-04-24 21:33:04.287326825 +0000 UTC m=+326.227172540 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5d813ff8-fb66-4a3c-807b-74c4ca04e040-certificates") pod "keda-metrics-apiserver-7c9f485588-gv45l" (UID: "5d813ff8-fb66-4a3c-807b-74c4ca04e040") : references non-existent secret key: tls.crt Apr 24 21:33:03.610118 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:03.610086 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-kxwxn" event={"ID":"d3eae593-ff88-4630-a639-34654f2d5f56","Type":"ContainerStarted","Data":"90ec4a3def22bc29a8c92baa57306d89637d8a4ee5a089dc999566dfc4379f99"} Apr 24 21:33:03.610475 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:03.610298 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-kxwxn" Apr 24 21:33:03.627738 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:03.627689 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-kxwxn" podStartSLOduration=2.101043869 podStartE2EDuration="3.62767744s" podCreationTimestamp="2026-04-24 21:33:00 +0000 UTC" firstStartedPulling="2026-04-24 21:33:01.267688777 +0000 UTC m=+323.207534489" lastFinishedPulling="2026-04-24 21:33:02.794322344 +0000 UTC m=+324.734168060" observedRunningTime="2026-04-24 21:33:03.626287396 +0000 UTC m=+325.566133129" watchObservedRunningTime="2026-04-24 21:33:03.62767744 +0000 UTC m=+325.567523173" Apr 24 21:33:04.002470 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:04.002431 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/236113b9-51e0-47b2-88de-f025f8df7a7a-certificates\") pod \"keda-operator-ffbb595cb-d6t4q\" (UID: \"236113b9-51e0-47b2-88de-f025f8df7a7a\") " pod="openshift-keda/keda-operator-ffbb595cb-d6t4q" Apr 24 21:33:04.002618 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:04.002523 2575 secret.go:281] 
references non-existent secret key: ca.crt Apr 24 21:33:04.002618 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:04.002539 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:33:04.002618 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:04.002550 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-d6t4q: references non-existent secret key: ca.crt Apr 24 21:33:04.002618 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:04.002616 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/236113b9-51e0-47b2-88de-f025f8df7a7a-certificates podName:236113b9-51e0-47b2-88de-f025f8df7a7a nodeName:}" failed. No retries permitted until 2026-04-24 21:33:08.002601585 +0000 UTC m=+329.942447312 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/236113b9-51e0-47b2-88de-f025f8df7a7a-certificates") pod "keda-operator-ffbb595cb-d6t4q" (UID: "236113b9-51e0-47b2-88de-f025f8df7a7a") : references non-existent secret key: ca.crt Apr 24 21:33:04.305444 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:04.305356 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5d813ff8-fb66-4a3c-807b-74c4ca04e040-certificates\") pod \"keda-metrics-apiserver-7c9f485588-gv45l\" (UID: \"5d813ff8-fb66-4a3c-807b-74c4ca04e040\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l" Apr 24 21:33:04.305603 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:04.305491 2575 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:33:04.305603 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:04.305506 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:33:04.305603 ip-10-0-136-160 
kubenswrapper[2575]: E0424 21:33:04.305522 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l: references non-existent secret key: tls.crt Apr 24 21:33:04.305603 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:33:04.305571 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d813ff8-fb66-4a3c-807b-74c4ca04e040-certificates podName:5d813ff8-fb66-4a3c-807b-74c4ca04e040 nodeName:}" failed. No retries permitted until 2026-04-24 21:33:08.305554705 +0000 UTC m=+330.245400427 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5d813ff8-fb66-4a3c-807b-74c4ca04e040-certificates") pod "keda-metrics-apiserver-7c9f485588-gv45l" (UID: "5d813ff8-fb66-4a3c-807b-74c4ca04e040") : references non-existent secret key: tls.crt Apr 24 21:33:08.032771 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:08.032735 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/236113b9-51e0-47b2-88de-f025f8df7a7a-certificates\") pod \"keda-operator-ffbb595cb-d6t4q\" (UID: \"236113b9-51e0-47b2-88de-f025f8df7a7a\") " pod="openshift-keda/keda-operator-ffbb595cb-d6t4q" Apr 24 21:33:08.035058 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:08.035038 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/236113b9-51e0-47b2-88de-f025f8df7a7a-certificates\") pod \"keda-operator-ffbb595cb-d6t4q\" (UID: \"236113b9-51e0-47b2-88de-f025f8df7a7a\") " pod="openshift-keda/keda-operator-ffbb595cb-d6t4q" Apr 24 21:33:08.309546 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:08.309465 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-d6t4q" Apr 24 21:33:08.334349 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:08.334319 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5d813ff8-fb66-4a3c-807b-74c4ca04e040-certificates\") pod \"keda-metrics-apiserver-7c9f485588-gv45l\" (UID: \"5d813ff8-fb66-4a3c-807b-74c4ca04e040\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l" Apr 24 21:33:08.337053 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:08.337020 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5d813ff8-fb66-4a3c-807b-74c4ca04e040-certificates\") pod \"keda-metrics-apiserver-7c9f485588-gv45l\" (UID: \"5d813ff8-fb66-4a3c-807b-74c4ca04e040\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l" Apr 24 21:33:08.457864 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:08.457839 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-d6t4q"] Apr 24 21:33:08.460998 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:33:08.460972 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod236113b9_51e0_47b2_88de_f025f8df7a7a.slice/crio-7d9f3488600a65a4f874213077494a4dd0813e29cc147c42887f7fd0085cb17a WatchSource:0}: Error finding container 7d9f3488600a65a4f874213077494a4dd0813e29cc147c42887f7fd0085cb17a: Status 404 returned error can't find the container with id 7d9f3488600a65a4f874213077494a4dd0813e29cc147c42887f7fd0085cb17a Apr 24 21:33:08.616557 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:08.616505 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l" Apr 24 21:33:08.625924 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:08.625902 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-d6t4q" event={"ID":"236113b9-51e0-47b2-88de-f025f8df7a7a","Type":"ContainerStarted","Data":"7d9f3488600a65a4f874213077494a4dd0813e29cc147c42887f7fd0085cb17a"} Apr 24 21:33:08.744094 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:08.744063 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l"] Apr 24 21:33:08.747243 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:33:08.747218 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d813ff8_fb66_4a3c_807b_74c4ca04e040.slice/crio-93885187e8489f4b49e921f7c653c15a5155a22ca64d43955ef11f9307587178 WatchSource:0}: Error finding container 93885187e8489f4b49e921f7c653c15a5155a22ca64d43955ef11f9307587178: Status 404 returned error can't find the container with id 93885187e8489f4b49e921f7c653c15a5155a22ca64d43955ef11f9307587178 Apr 24 21:33:09.631998 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:09.631957 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l" event={"ID":"5d813ff8-fb66-4a3c-807b-74c4ca04e040","Type":"ContainerStarted","Data":"93885187e8489f4b49e921f7c653c15a5155a22ca64d43955ef11f9307587178"} Apr 24 21:33:12.644556 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:12.644517 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l" event={"ID":"5d813ff8-fb66-4a3c-807b-74c4ca04e040","Type":"ContainerStarted","Data":"af4ea93ce1ebf72876f5da5cedd6048c57ee74eb81c26ead759ad6312a068927"} Apr 24 21:33:12.644556 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:12.644563 2575 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l" Apr 24 21:33:12.646135 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:12.646110 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-d6t4q" event={"ID":"236113b9-51e0-47b2-88de-f025f8df7a7a","Type":"ContainerStarted","Data":"a9fddd9886425545d314d63e5ab41991c5bcb79d352c87d822b26c8c3c2104e3"} Apr 24 21:33:12.646269 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:12.646247 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-d6t4q" Apr 24 21:33:12.664335 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:12.664284 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l" podStartSLOduration=9.560471437 podStartE2EDuration="12.664267561s" podCreationTimestamp="2026-04-24 21:33:00 +0000 UTC" firstStartedPulling="2026-04-24 21:33:08.748500089 +0000 UTC m=+330.688345800" lastFinishedPulling="2026-04-24 21:33:11.852296196 +0000 UTC m=+333.792141924" observedRunningTime="2026-04-24 21:33:12.663551335 +0000 UTC m=+334.603397105" watchObservedRunningTime="2026-04-24 21:33:12.664267561 +0000 UTC m=+334.604113295" Apr 24 21:33:12.682920 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:12.682857 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-d6t4q" podStartSLOduration=9.293528621 podStartE2EDuration="12.682841099s" podCreationTimestamp="2026-04-24 21:33:00 +0000 UTC" firstStartedPulling="2026-04-24 21:33:08.462275133 +0000 UTC m=+330.402120847" lastFinishedPulling="2026-04-24 21:33:11.851587613 +0000 UTC m=+333.791433325" observedRunningTime="2026-04-24 21:33:12.681414647 +0000 UTC m=+334.621260380" watchObservedRunningTime="2026-04-24 21:33:12.682841099 +0000 UTC m=+334.622686882" Apr 24 21:33:21.604042 
ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:21.604007 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-cdxkr" Apr 24 21:33:23.654918 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:23.654880 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gv45l" Apr 24 21:33:24.614883 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:24.614846 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-kxwxn" Apr 24 21:33:33.653217 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:33:33.653177 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-d6t4q" Apr 24 21:34:16.819021 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:16.818990 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84b6647887-xrn5p"] Apr 24 21:34:16.822472 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:16.822453 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-xrn5p" Apr 24 21:34:16.826401 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:16.826372 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:34:16.826555 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:16.826406 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:34:16.826652 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:16.826564 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-lr2gp\"" Apr 24 21:34:16.826652 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:16.826636 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 24 21:34:16.835008 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:16.834988 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-tnrjt"] Apr 24 21:34:16.838074 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:16.838055 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-xrn5p"] Apr 24 21:34:16.838187 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:16.838142 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-tnrjt"
Apr 24 21:34:16.840874 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:16.840856 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-vxkxp\""
Apr 24 21:34:16.841118 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:16.841101 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 24 21:34:16.845463 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:16.845155 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bdc21ac-4cce-47f6-b763-24b3ffc47c93-cert\") pod \"kserve-controller-manager-84b6647887-xrn5p\" (UID: \"0bdc21ac-4cce-47f6-b763-24b3ffc47c93\") " pod="kserve/kserve-controller-manager-84b6647887-xrn5p"
Apr 24 21:34:16.845463 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:16.845314 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22cvr\" (UniqueName: \"kubernetes.io/projected/0bdc21ac-4cce-47f6-b763-24b3ffc47c93-kube-api-access-22cvr\") pod \"kserve-controller-manager-84b6647887-xrn5p\" (UID: \"0bdc21ac-4cce-47f6-b763-24b3ffc47c93\") " pod="kserve/kserve-controller-manager-84b6647887-xrn5p"
Apr 24 21:34:16.848588 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:16.848555 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-tnrjt"]
Apr 24 21:34:16.946260 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:16.946235 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc2vr\" (UniqueName: \"kubernetes.io/projected/41b38f4c-58f1-4d05-b65f-38c0b99bd1cd-kube-api-access-qc2vr\") pod \"llmisvc-controller-manager-68cc5db7c4-tnrjt\" (UID: \"41b38f4c-58f1-4d05-b65f-38c0b99bd1cd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-tnrjt"
Apr 24 21:34:16.946398 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:16.946275 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22cvr\" (UniqueName: \"kubernetes.io/projected/0bdc21ac-4cce-47f6-b763-24b3ffc47c93-kube-api-access-22cvr\") pod \"kserve-controller-manager-84b6647887-xrn5p\" (UID: \"0bdc21ac-4cce-47f6-b763-24b3ffc47c93\") " pod="kserve/kserve-controller-manager-84b6647887-xrn5p"
Apr 24 21:34:16.946398 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:16.946304 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bdc21ac-4cce-47f6-b763-24b3ffc47c93-cert\") pod \"kserve-controller-manager-84b6647887-xrn5p\" (UID: \"0bdc21ac-4cce-47f6-b763-24b3ffc47c93\") " pod="kserve/kserve-controller-manager-84b6647887-xrn5p"
Apr 24 21:34:16.946398 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:16.946321 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41b38f4c-58f1-4d05-b65f-38c0b99bd1cd-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-tnrjt\" (UID: \"41b38f4c-58f1-4d05-b65f-38c0b99bd1cd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-tnrjt"
Apr 24 21:34:16.948646 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:16.948624 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bdc21ac-4cce-47f6-b763-24b3ffc47c93-cert\") pod \"kserve-controller-manager-84b6647887-xrn5p\" (UID: \"0bdc21ac-4cce-47f6-b763-24b3ffc47c93\") " pod="kserve/kserve-controller-manager-84b6647887-xrn5p"
Apr 24 21:34:16.968210 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:16.968184 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22cvr\" (UniqueName: \"kubernetes.io/projected/0bdc21ac-4cce-47f6-b763-24b3ffc47c93-kube-api-access-22cvr\") pod \"kserve-controller-manager-84b6647887-xrn5p\" (UID: \"0bdc21ac-4cce-47f6-b763-24b3ffc47c93\") " pod="kserve/kserve-controller-manager-84b6647887-xrn5p"
Apr 24 21:34:17.047388 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:17.047357 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41b38f4c-58f1-4d05-b65f-38c0b99bd1cd-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-tnrjt\" (UID: \"41b38f4c-58f1-4d05-b65f-38c0b99bd1cd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-tnrjt"
Apr 24 21:34:17.047492 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:17.047447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qc2vr\" (UniqueName: \"kubernetes.io/projected/41b38f4c-58f1-4d05-b65f-38c0b99bd1cd-kube-api-access-qc2vr\") pod \"llmisvc-controller-manager-68cc5db7c4-tnrjt\" (UID: \"41b38f4c-58f1-4d05-b65f-38c0b99bd1cd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-tnrjt"
Apr 24 21:34:17.047539 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:34:17.047509 2575 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found
Apr 24 21:34:17.047590 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:34:17.047580 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41b38f4c-58f1-4d05-b65f-38c0b99bd1cd-cert podName:41b38f4c-58f1-4d05-b65f-38c0b99bd1cd nodeName:}" failed. No retries permitted until 2026-04-24 21:34:17.547561316 +0000 UTC m=+399.487407041 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/41b38f4c-58f1-4d05-b65f-38c0b99bd1cd-cert") pod "llmisvc-controller-manager-68cc5db7c4-tnrjt" (UID: "41b38f4c-58f1-4d05-b65f-38c0b99bd1cd") : secret "llmisvc-webhook-server-cert" not found
Apr 24 21:34:17.070303 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:17.070251 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc2vr\" (UniqueName: \"kubernetes.io/projected/41b38f4c-58f1-4d05-b65f-38c0b99bd1cd-kube-api-access-qc2vr\") pod \"llmisvc-controller-manager-68cc5db7c4-tnrjt\" (UID: \"41b38f4c-58f1-4d05-b65f-38c0b99bd1cd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-tnrjt"
Apr 24 21:34:17.134115 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:17.134096 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-xrn5p"
Apr 24 21:34:17.254646 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:17.254621 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-xrn5p"]
Apr 24 21:34:17.255850 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:34:17.255824 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bdc21ac_4cce_47f6_b763_24b3ffc47c93.slice/crio-7ef139478e30513c55b06b05e1197be884b0a32a4676496bada1e8e6031dc18e WatchSource:0}: Error finding container 7ef139478e30513c55b06b05e1197be884b0a32a4676496bada1e8e6031dc18e: Status 404 returned error can't find the container with id 7ef139478e30513c55b06b05e1197be884b0a32a4676496bada1e8e6031dc18e
Apr 24 21:34:17.551910 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:17.551886 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41b38f4c-58f1-4d05-b65f-38c0b99bd1cd-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-tnrjt\" (UID: \"41b38f4c-58f1-4d05-b65f-38c0b99bd1cd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-tnrjt"
Apr 24 21:34:17.554331 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:17.554306 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41b38f4c-58f1-4d05-b65f-38c0b99bd1cd-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-tnrjt\" (UID: \"41b38f4c-58f1-4d05-b65f-38c0b99bd1cd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-tnrjt"
Apr 24 21:34:17.749748 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:17.749719 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-tnrjt"
Apr 24 21:34:17.871105 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:17.870978 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-xrn5p" event={"ID":"0bdc21ac-4cce-47f6-b763-24b3ffc47c93","Type":"ContainerStarted","Data":"7ef139478e30513c55b06b05e1197be884b0a32a4676496bada1e8e6031dc18e"}
Apr 24 21:34:17.908343 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:34:17.908313 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod41b38f4c_58f1_4d05_b65f_38c0b99bd1cd.slice/crio-006627e8644e02dc77e0d42c7eca565ea6dac284579f4411dff0f4fddb30c133 WatchSource:0}: Error finding container 006627e8644e02dc77e0d42c7eca565ea6dac284579f4411dff0f4fddb30c133: Status 404 returned error can't find the container with id 006627e8644e02dc77e0d42c7eca565ea6dac284579f4411dff0f4fddb30c133
Apr 24 21:34:17.913323 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:17.913299 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-tnrjt"]
Apr 24 21:34:18.876144 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:18.876101 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-tnrjt" event={"ID":"41b38f4c-58f1-4d05-b65f-38c0b99bd1cd","Type":"ContainerStarted","Data":"006627e8644e02dc77e0d42c7eca565ea6dac284579f4411dff0f4fddb30c133"}
Apr 24 21:34:20.883635 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:20.883599 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-xrn5p" event={"ID":"0bdc21ac-4cce-47f6-b763-24b3ffc47c93","Type":"ContainerStarted","Data":"4cc6f8d8b7afa4517a458fceb47b158bf3f2b2bebd07503bf4314877486494ef"}
Apr 24 21:34:20.884043 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:20.883781 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84b6647887-xrn5p"
Apr 24 21:34:20.906500 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:20.906459 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-84b6647887-xrn5p" podStartSLOduration=2.283322767 podStartE2EDuration="4.906445635s" podCreationTimestamp="2026-04-24 21:34:16 +0000 UTC" firstStartedPulling="2026-04-24 21:34:17.257185505 +0000 UTC m=+399.197031217" lastFinishedPulling="2026-04-24 21:34:19.88030837 +0000 UTC m=+401.820154085" observedRunningTime="2026-04-24 21:34:20.903899467 +0000 UTC m=+402.843745201" watchObservedRunningTime="2026-04-24 21:34:20.906445635 +0000 UTC m=+402.846291361"
Apr 24 21:34:21.888312 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:21.888260 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-tnrjt" event={"ID":"41b38f4c-58f1-4d05-b65f-38c0b99bd1cd","Type":"ContainerStarted","Data":"25e0bc67d60ab0f04615f5aee14aa58c76f38e70b15c65ebb68829d72c3a8ad5"}
Apr 24 21:34:21.888776 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:21.888472 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-tnrjt"
Apr 24 21:34:21.912192 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:21.912150 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-tnrjt" podStartSLOduration=2.887703644 podStartE2EDuration="5.912137014s" podCreationTimestamp="2026-04-24 21:34:16 +0000 UTC" firstStartedPulling="2026-04-24 21:34:17.910172065 +0000 UTC m=+399.850017780" lastFinishedPulling="2026-04-24 21:34:20.934605435 +0000 UTC m=+402.874451150" observedRunningTime="2026-04-24 21:34:21.910081221 +0000 UTC m=+403.849926955" watchObservedRunningTime="2026-04-24 21:34:21.912137014 +0000 UTC m=+403.851982746"
Apr 24 21:34:51.893066 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:51.892993 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84b6647887-xrn5p"
Apr 24 21:34:52.893778 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:52.893746 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-tnrjt"
Apr 24 21:34:54.242330 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:54.242289 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-xrn5p"]
Apr 24 21:34:54.242787 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:54.242571 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-84b6647887-xrn5p" podUID="0bdc21ac-4cce-47f6-b763-24b3ffc47c93" containerName="manager" containerID="cri-o://4cc6f8d8b7afa4517a458fceb47b158bf3f2b2bebd07503bf4314877486494ef" gracePeriod=10
Apr 24 21:34:54.268903 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:54.268876 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84b6647887-n6kbk"]
Apr 24 21:34:54.272034 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:54.272016 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-n6kbk"
Apr 24 21:34:54.281970 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:54.281945 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-n6kbk"]
Apr 24 21:34:54.321502 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:54.321482 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8c0694b-1dac-4f93-b968-338affc8e878-cert\") pod \"kserve-controller-manager-84b6647887-n6kbk\" (UID: \"d8c0694b-1dac-4f93-b968-338affc8e878\") " pod="kserve/kserve-controller-manager-84b6647887-n6kbk"
Apr 24 21:34:54.321594 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:54.321511 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp9gb\" (UniqueName: \"kubernetes.io/projected/d8c0694b-1dac-4f93-b968-338affc8e878-kube-api-access-hp9gb\") pod \"kserve-controller-manager-84b6647887-n6kbk\" (UID: \"d8c0694b-1dac-4f93-b968-338affc8e878\") " pod="kserve/kserve-controller-manager-84b6647887-n6kbk"
Apr 24 21:34:54.423039 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:54.423008 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8c0694b-1dac-4f93-b968-338affc8e878-cert\") pod \"kserve-controller-manager-84b6647887-n6kbk\" (UID: \"d8c0694b-1dac-4f93-b968-338affc8e878\") " pod="kserve/kserve-controller-manager-84b6647887-n6kbk"
Apr 24 21:34:54.423147 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:54.423059 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hp9gb\" (UniqueName: \"kubernetes.io/projected/d8c0694b-1dac-4f93-b968-338affc8e878-kube-api-access-hp9gb\") pod \"kserve-controller-manager-84b6647887-n6kbk\" (UID: \"d8c0694b-1dac-4f93-b968-338affc8e878\") " pod="kserve/kserve-controller-manager-84b6647887-n6kbk"
Apr 24 21:34:54.425300 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:54.425275 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8c0694b-1dac-4f93-b968-338affc8e878-cert\") pod \"kserve-controller-manager-84b6647887-n6kbk\" (UID: \"d8c0694b-1dac-4f93-b968-338affc8e878\") " pod="kserve/kserve-controller-manager-84b6647887-n6kbk"
Apr 24 21:34:54.431186 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:54.431164 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp9gb\" (UniqueName: \"kubernetes.io/projected/d8c0694b-1dac-4f93-b968-338affc8e878-kube-api-access-hp9gb\") pod \"kserve-controller-manager-84b6647887-n6kbk\" (UID: \"d8c0694b-1dac-4f93-b968-338affc8e878\") " pod="kserve/kserve-controller-manager-84b6647887-n6kbk"
Apr 24 21:34:54.479028 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:54.479010 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-xrn5p"
Apr 24 21:34:54.616736 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:54.616674 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-n6kbk"
Apr 24 21:34:54.625405 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:54.625384 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22cvr\" (UniqueName: \"kubernetes.io/projected/0bdc21ac-4cce-47f6-b763-24b3ffc47c93-kube-api-access-22cvr\") pod \"0bdc21ac-4cce-47f6-b763-24b3ffc47c93\" (UID: \"0bdc21ac-4cce-47f6-b763-24b3ffc47c93\") "
Apr 24 21:34:54.625544 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:54.625495 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bdc21ac-4cce-47f6-b763-24b3ffc47c93-cert\") pod \"0bdc21ac-4cce-47f6-b763-24b3ffc47c93\" (UID: \"0bdc21ac-4cce-47f6-b763-24b3ffc47c93\") "
Apr 24 21:34:54.627411 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:54.627383 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bdc21ac-4cce-47f6-b763-24b3ffc47c93-cert" (OuterVolumeSpecName: "cert") pod "0bdc21ac-4cce-47f6-b763-24b3ffc47c93" (UID: "0bdc21ac-4cce-47f6-b763-24b3ffc47c93"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:34:54.627521 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:54.627448 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bdc21ac-4cce-47f6-b763-24b3ffc47c93-kube-api-access-22cvr" (OuterVolumeSpecName: "kube-api-access-22cvr") pod "0bdc21ac-4cce-47f6-b763-24b3ffc47c93" (UID: "0bdc21ac-4cce-47f6-b763-24b3ffc47c93"). InnerVolumeSpecName "kube-api-access-22cvr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:34:54.726319 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:54.726295 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-22cvr\" (UniqueName: \"kubernetes.io/projected/0bdc21ac-4cce-47f6-b763-24b3ffc47c93-kube-api-access-22cvr\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\""
Apr 24 21:34:54.726319 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:54.726320 2575 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bdc21ac-4cce-47f6-b763-24b3ffc47c93-cert\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\""
Apr 24 21:34:54.733940 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:54.733919 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-n6kbk"]
Apr 24 21:34:54.735887 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:34:54.735865 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8c0694b_1dac_4f93_b968_338affc8e878.slice/crio-8ce4605ba83f7e7c28c9822ad35ffa8e21f65f275104e714593013b144cd36db WatchSource:0}: Error finding container 8ce4605ba83f7e7c28c9822ad35ffa8e21f65f275104e714593013b144cd36db: Status 404 returned error can't find the container with id 8ce4605ba83f7e7c28c9822ad35ffa8e21f65f275104e714593013b144cd36db
Apr 24 21:34:55.004128 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:55.004098 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-n6kbk" event={"ID":"d8c0694b-1dac-4f93-b968-338affc8e878","Type":"ContainerStarted","Data":"8ce4605ba83f7e7c28c9822ad35ffa8e21f65f275104e714593013b144cd36db"}
Apr 24 21:34:55.005195 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:55.005165 2575 generic.go:358] "Generic (PLEG): container finished" podID="0bdc21ac-4cce-47f6-b763-24b3ffc47c93" containerID="4cc6f8d8b7afa4517a458fceb47b158bf3f2b2bebd07503bf4314877486494ef" exitCode=0
Apr 24 21:34:55.005285 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:55.005242 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-xrn5p"
Apr 24 21:34:55.005285 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:55.005247 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-xrn5p" event={"ID":"0bdc21ac-4cce-47f6-b763-24b3ffc47c93","Type":"ContainerDied","Data":"4cc6f8d8b7afa4517a458fceb47b158bf3f2b2bebd07503bf4314877486494ef"}
Apr 24 21:34:55.005285 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:55.005280 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-xrn5p" event={"ID":"0bdc21ac-4cce-47f6-b763-24b3ffc47c93","Type":"ContainerDied","Data":"7ef139478e30513c55b06b05e1197be884b0a32a4676496bada1e8e6031dc18e"}
Apr 24 21:34:55.005396 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:55.005295 2575 scope.go:117] "RemoveContainer" containerID="4cc6f8d8b7afa4517a458fceb47b158bf3f2b2bebd07503bf4314877486494ef"
Apr 24 21:34:55.013028 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:55.013011 2575 scope.go:117] "RemoveContainer" containerID="4cc6f8d8b7afa4517a458fceb47b158bf3f2b2bebd07503bf4314877486494ef"
Apr 24 21:34:55.013268 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:34:55.013247 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cc6f8d8b7afa4517a458fceb47b158bf3f2b2bebd07503bf4314877486494ef\": container with ID starting with 4cc6f8d8b7afa4517a458fceb47b158bf3f2b2bebd07503bf4314877486494ef not found: ID does not exist" containerID="4cc6f8d8b7afa4517a458fceb47b158bf3f2b2bebd07503bf4314877486494ef"
Apr 24 21:34:55.013327 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:55.013275 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cc6f8d8b7afa4517a458fceb47b158bf3f2b2bebd07503bf4314877486494ef"} err="failed to get container status \"4cc6f8d8b7afa4517a458fceb47b158bf3f2b2bebd07503bf4314877486494ef\": rpc error: code = NotFound desc = could not find container \"4cc6f8d8b7afa4517a458fceb47b158bf3f2b2bebd07503bf4314877486494ef\": container with ID starting with 4cc6f8d8b7afa4517a458fceb47b158bf3f2b2bebd07503bf4314877486494ef not found: ID does not exist"
Apr 24 21:34:55.032300 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:55.032269 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-xrn5p"]
Apr 24 21:34:55.036508 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:55.036486 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-xrn5p"]
Apr 24 21:34:56.010531 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:56.010494 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-n6kbk" event={"ID":"d8c0694b-1dac-4f93-b968-338affc8e878","Type":"ContainerStarted","Data":"9fe9ae3cafff5907f6d3cc20709af0c482a7ec8ddab3c6f45dc2b636d31ef8f9"}
Apr 24 21:34:56.010937 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:56.010558 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84b6647887-n6kbk"
Apr 24 21:34:56.032541 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:56.032487 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-84b6647887-n6kbk" podStartSLOduration=1.615915531 podStartE2EDuration="2.032469313s" podCreationTimestamp="2026-04-24 21:34:54 +0000 UTC" firstStartedPulling="2026-04-24 21:34:54.736998961 +0000 UTC m=+436.676844672" lastFinishedPulling="2026-04-24 21:34:55.153552739 +0000 UTC m=+437.093398454" observedRunningTime="2026-04-24 21:34:56.031759965 +0000 UTC m=+437.971606036" watchObservedRunningTime="2026-04-24 21:34:56.032469313 +0000 UTC m=+437.972315046"
Apr 24 21:34:56.582467 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:34:56.582409 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bdc21ac-4cce-47f6-b763-24b3ffc47c93" path="/var/lib/kubelet/pods/0bdc21ac-4cce-47f6-b763-24b3ffc47c93/volumes"
Apr 24 21:35:27.021052 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:27.021022 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84b6647887-n6kbk"
Apr 24 21:35:48.723910 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:48.723879 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-75b4b56f7b-pb7bx"]
Apr 24 21:35:48.724336 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:48.724223 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0bdc21ac-4cce-47f6-b763-24b3ffc47c93" containerName="manager"
Apr 24 21:35:48.724336 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:48.724234 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bdc21ac-4cce-47f6-b763-24b3ffc47c93" containerName="manager"
Apr 24 21:35:48.724336 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:48.724293 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0bdc21ac-4cce-47f6-b763-24b3ffc47c93" containerName="manager"
Apr 24 21:35:48.726927 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:48.726912 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:48.742232 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:48.742211 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75b4b56f7b-pb7bx"]
Apr 24 21:35:48.915176 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:48.915151 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a-console-config\") pod \"console-75b4b56f7b-pb7bx\" (UID: \"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a\") " pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:48.915297 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:48.915196 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a-console-serving-cert\") pod \"console-75b4b56f7b-pb7bx\" (UID: \"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a\") " pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:48.915297 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:48.915225 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a-oauth-serving-cert\") pod \"console-75b4b56f7b-pb7bx\" (UID: \"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a\") " pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:48.915297 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:48.915265 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a-trusted-ca-bundle\") pod \"console-75b4b56f7b-pb7bx\" (UID: \"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a\") " pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:48.915297 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:48.915287 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a-service-ca\") pod \"console-75b4b56f7b-pb7bx\" (UID: \"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a\") " pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:48.915457 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:48.915303 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chfnn\" (UniqueName: \"kubernetes.io/projected/8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a-kube-api-access-chfnn\") pod \"console-75b4b56f7b-pb7bx\" (UID: \"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a\") " pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:48.915457 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:48.915390 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a-console-oauth-config\") pod \"console-75b4b56f7b-pb7bx\" (UID: \"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a\") " pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:49.016214 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:49.016153 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a-console-serving-cert\") pod \"console-75b4b56f7b-pb7bx\" (UID: \"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a\") " pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:49.016214 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:49.016182 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a-oauth-serving-cert\") pod \"console-75b4b56f7b-pb7bx\" (UID: \"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a\") " pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:49.016214 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:49.016200 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a-trusted-ca-bundle\") pod \"console-75b4b56f7b-pb7bx\" (UID: \"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a\") " pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:49.016375 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:49.016218 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a-service-ca\") pod \"console-75b4b56f7b-pb7bx\" (UID: \"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a\") " pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:49.016488 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:49.016464 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chfnn\" (UniqueName: \"kubernetes.io/projected/8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a-kube-api-access-chfnn\") pod \"console-75b4b56f7b-pb7bx\" (UID: \"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a\") " pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:49.016586 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:49.016570 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a-console-oauth-config\") pod \"console-75b4b56f7b-pb7bx\" (UID: \"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a\") " pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:49.016643 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:49.016611 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a-console-config\") pod \"console-75b4b56f7b-pb7bx\" (UID: \"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a\") " pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:49.017044 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:49.017019 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a-service-ca\") pod \"console-75b4b56f7b-pb7bx\" (UID: \"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a\") " pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:49.017148 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:49.017089 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a-oauth-serving-cert\") pod \"console-75b4b56f7b-pb7bx\" (UID: \"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a\") " pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:49.017148 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:49.017131 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a-trusted-ca-bundle\") pod \"console-75b4b56f7b-pb7bx\" (UID: \"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a\") " pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:49.017271 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:49.017230 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a-console-config\") pod \"console-75b4b56f7b-pb7bx\" (UID: \"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a\") " pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:49.018894 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:49.018867 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a-console-serving-cert\") pod \"console-75b4b56f7b-pb7bx\" (UID: \"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a\") " pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:49.019000 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:49.018950 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a-console-oauth-config\") pod \"console-75b4b56f7b-pb7bx\" (UID: \"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a\") " pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:49.025869 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:49.025845 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chfnn\" (UniqueName: \"kubernetes.io/projected/8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a-kube-api-access-chfnn\") pod \"console-75b4b56f7b-pb7bx\" (UID: \"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a\") " pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:49.036605 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:49.036582 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:49.159618 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:49.159594 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75b4b56f7b-pb7bx"]
Apr 24 21:35:49.162373 ip-10-0-136-160 kubenswrapper[2575]: W0424 21:35:49.162340 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b74e0af_6580_4a3f_8ecf_c616e5ba0f0a.slice/crio-36d2efc069d6f33cd4b722983b65a936b3be45d733f280ead4a77afa77af4f38 WatchSource:0}: Error finding container 36d2efc069d6f33cd4b722983b65a936b3be45d733f280ead4a77afa77af4f38: Status 404 returned error can't find the container with id 36d2efc069d6f33cd4b722983b65a936b3be45d733f280ead4a77afa77af4f38
Apr 24 21:35:49.198293 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:49.198266 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75b4b56f7b-pb7bx" event={"ID":"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a","Type":"ContainerStarted","Data":"36d2efc069d6f33cd4b722983b65a936b3be45d733f280ead4a77afa77af4f38"}
Apr 24 21:35:50.204263 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:50.204228 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75b4b56f7b-pb7bx" event={"ID":"8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a","Type":"ContainerStarted","Data":"6ad79dbb7940cc544b4ff4eb47cf646623f8f82f3a14689864bab097b3e943ea"}
Apr 24 21:35:50.226235 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:50.226179 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-75b4b56f7b-pb7bx" podStartSLOduration=2.226161856 podStartE2EDuration="2.226161856s" podCreationTimestamp="2026-04-24 21:35:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:35:50.224317557 +0000 UTC m=+492.164163290" watchObservedRunningTime="2026-04-24 21:35:50.226161856 +0000 UTC m=+492.166007586"
Apr 24 21:35:59.037743 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:59.037706 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:59.037743 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:59.037747 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:59.042251 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:59.042227 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:59.239506 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:59.239479 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-75b4b56f7b-pb7bx"
Apr 24 21:35:59.288122 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:35:59.288044 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5757f86cf4-s8bwr"]
Apr 24 21:36:24.309096 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.309030 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5757f86cf4-s8bwr" podUID="4b687bba-4000-4cf4-abd1-c3b4696936ad" containerName="console" containerID="cri-o://d949184fe3841b8c8f8d797801d71d52b72aeba07ac7891969ac47ecdb4d5330" gracePeriod=15
Apr 24 21:36:24.375905 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.375875 2575 patch_prober.go:28] interesting pod/console-5757f86cf4-s8bwr container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.134.0.23:8443/health\": dial tcp 10.134.0.23:8443: connect: connection refused" start-of-body=
Apr 24 21:36:24.376029 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.375934 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-5757f86cf4-s8bwr" podUID="4b687bba-4000-4cf4-abd1-c3b4696936ad" containerName="console" probeResult="failure" output="Get \"https://10.134.0.23:8443/health\": dial tcp 10.134.0.23:8443: connect: connection refused"
Apr 24 21:36:24.543483 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.543454 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5757f86cf4-s8bwr_4b687bba-4000-4cf4-abd1-c3b4696936ad/console/0.log"
Apr 24 21:36:24.543593 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.543516 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5757f86cf4-s8bwr"
Apr 24 21:36:24.678286 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.678228 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-service-ca\") pod \"4b687bba-4000-4cf4-abd1-c3b4696936ad\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") "
Apr 24 21:36:24.678286 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.678273 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-console-config\") pod \"4b687bba-4000-4cf4-abd1-c3b4696936ad\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") "
Apr 24 21:36:24.678471 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.678309 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-trusted-ca-bundle\") pod \"4b687bba-4000-4cf4-abd1-c3b4696936ad\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") "
Apr 24 21:36:24.678471 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.678342 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-dfww4\" (UniqueName: \"kubernetes.io/projected/4b687bba-4000-4cf4-abd1-c3b4696936ad-kube-api-access-dfww4\") pod \"4b687bba-4000-4cf4-abd1-c3b4696936ad\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " Apr 24 21:36:24.678471 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.678378 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4b687bba-4000-4cf4-abd1-c3b4696936ad-console-oauth-config\") pod \"4b687bba-4000-4cf4-abd1-c3b4696936ad\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " Apr 24 21:36:24.678471 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.678437 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-oauth-serving-cert\") pod \"4b687bba-4000-4cf4-abd1-c3b4696936ad\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " Apr 24 21:36:24.678471 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.678464 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b687bba-4000-4cf4-abd1-c3b4696936ad-console-serving-cert\") pod \"4b687bba-4000-4cf4-abd1-c3b4696936ad\" (UID: \"4b687bba-4000-4cf4-abd1-c3b4696936ad\") " Apr 24 21:36:24.678721 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.678680 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-service-ca" (OuterVolumeSpecName: "service-ca") pod "4b687bba-4000-4cf4-abd1-c3b4696936ad" (UID: "4b687bba-4000-4cf4-abd1-c3b4696936ad"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:36:24.678779 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.678711 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-console-config" (OuterVolumeSpecName: "console-config") pod "4b687bba-4000-4cf4-abd1-c3b4696936ad" (UID: "4b687bba-4000-4cf4-abd1-c3b4696936ad"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:36:24.678779 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.678748 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4b687bba-4000-4cf4-abd1-c3b4696936ad" (UID: "4b687bba-4000-4cf4-abd1-c3b4696936ad"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:36:24.678873 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.678803 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4b687bba-4000-4cf4-abd1-c3b4696936ad" (UID: "4b687bba-4000-4cf4-abd1-c3b4696936ad"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:36:24.678968 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.678949 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-oauth-serving-cert\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:36:24.679025 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.678974 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-service-ca\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:36:24.679025 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.678990 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-console-config\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:36:24.679025 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.679005 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b687bba-4000-4cf4-abd1-c3b4696936ad-trusted-ca-bundle\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:36:24.680612 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.680582 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b687bba-4000-4cf4-abd1-c3b4696936ad-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4b687bba-4000-4cf4-abd1-c3b4696936ad" (UID: "4b687bba-4000-4cf4-abd1-c3b4696936ad"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:36:24.680704 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.680593 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b687bba-4000-4cf4-abd1-c3b4696936ad-kube-api-access-dfww4" (OuterVolumeSpecName: "kube-api-access-dfww4") pod "4b687bba-4000-4cf4-abd1-c3b4696936ad" (UID: "4b687bba-4000-4cf4-abd1-c3b4696936ad"). InnerVolumeSpecName "kube-api-access-dfww4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:36:24.680763 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.680732 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b687bba-4000-4cf4-abd1-c3b4696936ad-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4b687bba-4000-4cf4-abd1-c3b4696936ad" (UID: "4b687bba-4000-4cf4-abd1-c3b4696936ad"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:36:24.779603 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.779580 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dfww4\" (UniqueName: \"kubernetes.io/projected/4b687bba-4000-4cf4-abd1-c3b4696936ad-kube-api-access-dfww4\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:36:24.779603 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.779601 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4b687bba-4000-4cf4-abd1-c3b4696936ad-console-oauth-config\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:36:24.779728 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:24.779611 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b687bba-4000-4cf4-abd1-c3b4696936ad-console-serving-cert\") on node \"ip-10-0-136-160.ec2.internal\" DevicePath \"\"" Apr 24 21:36:25.333781 
ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:25.333755 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5757f86cf4-s8bwr_4b687bba-4000-4cf4-abd1-c3b4696936ad/console/0.log" Apr 24 21:36:25.334142 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:25.333802 2575 generic.go:358] "Generic (PLEG): container finished" podID="4b687bba-4000-4cf4-abd1-c3b4696936ad" containerID="d949184fe3841b8c8f8d797801d71d52b72aeba07ac7891969ac47ecdb4d5330" exitCode=2 Apr 24 21:36:25.334142 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:25.333880 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5757f86cf4-s8bwr" Apr 24 21:36:25.334142 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:25.333890 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5757f86cf4-s8bwr" event={"ID":"4b687bba-4000-4cf4-abd1-c3b4696936ad","Type":"ContainerDied","Data":"d949184fe3841b8c8f8d797801d71d52b72aeba07ac7891969ac47ecdb4d5330"} Apr 24 21:36:25.334142 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:25.333931 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5757f86cf4-s8bwr" event={"ID":"4b687bba-4000-4cf4-abd1-c3b4696936ad","Type":"ContainerDied","Data":"dda2442d518806b1ea924a0913c28355b8538f8479157488c4d8f00899571222"} Apr 24 21:36:25.334142 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:25.333948 2575 scope.go:117] "RemoveContainer" containerID="d949184fe3841b8c8f8d797801d71d52b72aeba07ac7891969ac47ecdb4d5330" Apr 24 21:36:25.343441 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:25.343402 2575 scope.go:117] "RemoveContainer" containerID="d949184fe3841b8c8f8d797801d71d52b72aeba07ac7891969ac47ecdb4d5330" Apr 24 21:36:25.343745 ip-10-0-136-160 kubenswrapper[2575]: E0424 21:36:25.343723 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d949184fe3841b8c8f8d797801d71d52b72aeba07ac7891969ac47ecdb4d5330\": container with ID starting with d949184fe3841b8c8f8d797801d71d52b72aeba07ac7891969ac47ecdb4d5330 not found: ID does not exist" containerID="d949184fe3841b8c8f8d797801d71d52b72aeba07ac7891969ac47ecdb4d5330" Apr 24 21:36:25.343820 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:25.343750 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d949184fe3841b8c8f8d797801d71d52b72aeba07ac7891969ac47ecdb4d5330"} err="failed to get container status \"d949184fe3841b8c8f8d797801d71d52b72aeba07ac7891969ac47ecdb4d5330\": rpc error: code = NotFound desc = could not find container \"d949184fe3841b8c8f8d797801d71d52b72aeba07ac7891969ac47ecdb4d5330\": container with ID starting with d949184fe3841b8c8f8d797801d71d52b72aeba07ac7891969ac47ecdb4d5330 not found: ID does not exist" Apr 24 21:36:25.359341 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:25.359314 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5757f86cf4-s8bwr"] Apr 24 21:36:25.363111 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:25.363091 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5757f86cf4-s8bwr"] Apr 24 21:36:26.583533 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:36:26.583495 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b687bba-4000-4cf4-abd1-c3b4696936ad" path="/var/lib/kubelet/pods/4b687bba-4000-4cf4-abd1-c3b4696936ad/volumes" Apr 24 21:37:38.758566 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:37:38.758465 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 21:37:38.761755 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:37:38.760235 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 21:42:38.785722 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:42:38.785605 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 21:42:38.787843 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:42:38.787821 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 21:47:38.809600 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:47:38.809504 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 21:47:38.812870 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:47:38.812848 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 21:52:38.834977 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:52:38.834867 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 21:52:38.840312 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:52:38.840291 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 21:57:38.867316 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:57:38.867194 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 21:57:38.877129 ip-10-0-136-160 kubenswrapper[2575]: I0424 21:57:38.877097 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 22:02:38.901498 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:02:38.901374 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 22:02:38.905917 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:02:38.905893 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 22:07:38.930651 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:07:38.930545 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 22:07:38.942093 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:07:38.942074 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 22:12:38.964095 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:12:38.964056 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 22:12:38.970737 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:12:38.970714 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 22:17:38.995939 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:17:38.995827 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 22:17:39.002246 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:17:39.002222 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 22:22:39.022889 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:22:39.022766 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 22:22:39.029276 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:22:39.029256 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 22:27:39.049785 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:27:39.049666 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 22:27:39.056771 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:27:39.055582 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log" Apr 24 22:31:14.009975 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:14.009944 2575 ???:1] "http: TLS handshake error from 10.0.129.230:52896: EOF" Apr 24 22:31:14.017189 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:14.017162 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-f645d_f92f23e3-4a12-4a9e-b80d-b1da1f6662c0/global-pull-secret-syncer/0.log" Apr 24 22:31:14.102785 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:14.102765 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-58dwf_19fe13f9-cfb4-4b0f-8c65-000ccc157cbb/konnectivity-agent/0.log" Apr 24 22:31:14.255130 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:14.255109 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-160.ec2.internal_0eb3818d29d9daefc24a29562dc700e4/haproxy/0.log" Apr 24 22:31:17.739488 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:17.739450 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b1dfa7d7-f1c8-4219-b234-5d14e7096ce6/alertmanager/0.log" Apr 24 22:31:17.763248 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:17.763224 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b1dfa7d7-f1c8-4219-b234-5d14e7096ce6/config-reloader/0.log" Apr 24 22:31:17.794982 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:17.794953 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b1dfa7d7-f1c8-4219-b234-5d14e7096ce6/kube-rbac-proxy-web/0.log" Apr 24 22:31:17.817896 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:17.817865 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b1dfa7d7-f1c8-4219-b234-5d14e7096ce6/kube-rbac-proxy/0.log" Apr 24 22:31:17.839763 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:17.839739 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b1dfa7d7-f1c8-4219-b234-5d14e7096ce6/kube-rbac-proxy-metric/0.log" Apr 24 22:31:17.866195 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:17.866177 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b1dfa7d7-f1c8-4219-b234-5d14e7096ce6/prom-label-proxy/0.log" Apr 24 22:31:17.889573 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:17.889555 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b1dfa7d7-f1c8-4219-b234-5d14e7096ce6/init-config-reloader/0.log" Apr 24 22:31:17.974409 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:17.974392 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-pb4jj_713b6f30-0ef6-4532-af69-cc0928983c5b/kube-state-metrics/0.log" Apr 24 22:31:17.996477 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:17.996411 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-pb4jj_713b6f30-0ef6-4532-af69-cc0928983c5b/kube-rbac-proxy-main/0.log" Apr 24 22:31:18.018862 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:18.018846 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-pb4jj_713b6f30-0ef6-4532-af69-cc0928983c5b/kube-rbac-proxy-self/0.log" Apr 24 22:31:18.074148 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:18.074124 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-5xpg7_e9c27418-28ab-495b-b376-26606025a79e/monitoring-plugin/0.log" Apr 24 22:31:18.184030 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:18.184013 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rg89r_b62e611a-7e82-44ee-b32b-a1c65c0e67f3/node-exporter/0.log" Apr 24 22:31:18.207093 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:18.207077 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rg89r_b62e611a-7e82-44ee-b32b-a1c65c0e67f3/kube-rbac-proxy/0.log" Apr 24 22:31:18.232023 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:18.231999 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rg89r_b62e611a-7e82-44ee-b32b-a1c65c0e67f3/init-textfile/0.log" Apr 24 22:31:18.339573 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:18.339511 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lx2hf_253e4ec4-590d-47fb-8e5f-d260cbf867f8/kube-rbac-proxy-main/0.log" Apr 24 22:31:18.361852 ip-10-0-136-160 
kubenswrapper[2575]: I0424 22:31:18.361830 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lx2hf_253e4ec4-590d-47fb-8e5f-d260cbf867f8/kube-rbac-proxy-self/0.log" Apr 24 22:31:18.386647 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:18.386627 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lx2hf_253e4ec4-590d-47fb-8e5f-d260cbf867f8/openshift-state-metrics/0.log" Apr 24 22:31:20.769345 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:20.769311 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75b4b56f7b-pb7bx_8b74e0af-6580-4a3f-8ecf-c616e5ba0f0a/console/0.log" Apr 24 22:31:21.165358 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.165289 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm"] Apr 24 22:31:21.165651 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.165637 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b687bba-4000-4cf4-abd1-c3b4696936ad" containerName="console" Apr 24 22:31:21.165697 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.165652 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b687bba-4000-4cf4-abd1-c3b4696936ad" containerName="console" Apr 24 22:31:21.165730 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.165723 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b687bba-4000-4cf4-abd1-c3b4696936ad" containerName="console" Apr 24 22:31:21.168931 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.168912 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm" Apr 24 22:31:21.171780 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.171758 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-scnl7\"/\"kube-root-ca.crt\"" Apr 24 22:31:21.172924 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.172906 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-scnl7\"/\"openshift-service-ca.crt\"" Apr 24 22:31:21.173037 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.172928 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-scnl7\"/\"default-dockercfg-vdl5x\"" Apr 24 22:31:21.179787 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.179764 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm"] Apr 24 22:31:21.234726 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.234708 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-xl4xs_0c63e97e-44fc-421c-8de0-988acb06e78e/volume-data-source-validator/0.log" Apr 24 22:31:21.246460 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.246410 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26d46392-dcd7-4481-9c65-1cff788a93c9-lib-modules\") pod \"perf-node-gather-daemonset-hdbpm\" (UID: \"26d46392-dcd7-4481-9c65-1cff788a93c9\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm" Apr 24 22:31:21.246562 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.246473 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5ptc\" (UniqueName: 
\"kubernetes.io/projected/26d46392-dcd7-4481-9c65-1cff788a93c9-kube-api-access-s5ptc\") pod \"perf-node-gather-daemonset-hdbpm\" (UID: \"26d46392-dcd7-4481-9c65-1cff788a93c9\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm"
Apr 24 22:31:21.246562 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.246495 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/26d46392-dcd7-4481-9c65-1cff788a93c9-proc\") pod \"perf-node-gather-daemonset-hdbpm\" (UID: \"26d46392-dcd7-4481-9c65-1cff788a93c9\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm"
Apr 24 22:31:21.246562 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.246512 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/26d46392-dcd7-4481-9c65-1cff788a93c9-podres\") pod \"perf-node-gather-daemonset-hdbpm\" (UID: \"26d46392-dcd7-4481-9c65-1cff788a93c9\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm"
Apr 24 22:31:21.246562 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.246549 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26d46392-dcd7-4481-9c65-1cff788a93c9-sys\") pod \"perf-node-gather-daemonset-hdbpm\" (UID: \"26d46392-dcd7-4481-9c65-1cff788a93c9\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm"
Apr 24 22:31:21.347225 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.347204 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26d46392-dcd7-4481-9c65-1cff788a93c9-sys\") pod \"perf-node-gather-daemonset-hdbpm\" (UID: \"26d46392-dcd7-4481-9c65-1cff788a93c9\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm"
Apr 24 22:31:21.347319 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.347246 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26d46392-dcd7-4481-9c65-1cff788a93c9-lib-modules\") pod \"perf-node-gather-daemonset-hdbpm\" (UID: \"26d46392-dcd7-4481-9c65-1cff788a93c9\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm"
Apr 24 22:31:21.347319 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.347269 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5ptc\" (UniqueName: \"kubernetes.io/projected/26d46392-dcd7-4481-9c65-1cff788a93c9-kube-api-access-s5ptc\") pod \"perf-node-gather-daemonset-hdbpm\" (UID: \"26d46392-dcd7-4481-9c65-1cff788a93c9\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm"
Apr 24 22:31:21.347319 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.347288 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/26d46392-dcd7-4481-9c65-1cff788a93c9-proc\") pod \"perf-node-gather-daemonset-hdbpm\" (UID: \"26d46392-dcd7-4481-9c65-1cff788a93c9\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm"
Apr 24 22:31:21.347319 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.347303 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/26d46392-dcd7-4481-9c65-1cff788a93c9-podres\") pod \"perf-node-gather-daemonset-hdbpm\" (UID: \"26d46392-dcd7-4481-9c65-1cff788a93c9\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm"
Apr 24 22:31:21.347511 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.347325 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26d46392-dcd7-4481-9c65-1cff788a93c9-sys\") pod \"perf-node-gather-daemonset-hdbpm\" (UID: \"26d46392-dcd7-4481-9c65-1cff788a93c9\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm"
Apr 24 22:31:21.347511 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.347366 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/26d46392-dcd7-4481-9c65-1cff788a93c9-proc\") pod \"perf-node-gather-daemonset-hdbpm\" (UID: \"26d46392-dcd7-4481-9c65-1cff788a93c9\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm"
Apr 24 22:31:21.347511 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.347380 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26d46392-dcd7-4481-9c65-1cff788a93c9-lib-modules\") pod \"perf-node-gather-daemonset-hdbpm\" (UID: \"26d46392-dcd7-4481-9c65-1cff788a93c9\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm"
Apr 24 22:31:21.347511 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.347391 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/26d46392-dcd7-4481-9c65-1cff788a93c9-podres\") pod \"perf-node-gather-daemonset-hdbpm\" (UID: \"26d46392-dcd7-4481-9c65-1cff788a93c9\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm"
Apr 24 22:31:21.355181 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.355163 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5ptc\" (UniqueName: \"kubernetes.io/projected/26d46392-dcd7-4481-9c65-1cff788a93c9-kube-api-access-s5ptc\") pod \"perf-node-gather-daemonset-hdbpm\" (UID: \"26d46392-dcd7-4481-9c65-1cff788a93c9\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm"
Apr 24 22:31:21.479367 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.479346 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm"
Apr 24 22:31:21.597955 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.597932 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm"]
Apr 24 22:31:21.600622 ip-10-0-136-160 kubenswrapper[2575]: W0424 22:31:21.600592 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod26d46392_dcd7_4481_9c65_1cff788a93c9.slice/crio-31f3f979566ce33e6e25afb8c6c12e284643feed7339be096ba432006af18c4f WatchSource:0}: Error finding container 31f3f979566ce33e6e25afb8c6c12e284643feed7339be096ba432006af18c4f: Status 404 returned error can't find the container with id 31f3f979566ce33e6e25afb8c6c12e284643feed7339be096ba432006af18c4f
Apr 24 22:31:21.602304 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.602286 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:31:21.940239 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.940212 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8m598_4c8553a4-97bd-43aa-a9ab-7ccbb4358a98/dns/0.log"
Apr 24 22:31:21.960818 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:21.960792 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8m598_4c8553a4-97bd-43aa-a9ab-7ccbb4358a98/kube-rbac-proxy/0.log"
Apr 24 22:31:22.008245 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:22.008215 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm" event={"ID":"26d46392-dcd7-4481-9c65-1cff788a93c9","Type":"ContainerStarted","Data":"1058190a9d9fb5402bce8914e756d187384389abc5e860105f8c9ae11da2b441"}
Apr 24 22:31:22.008348 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:22.008250 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm" event={"ID":"26d46392-dcd7-4481-9c65-1cff788a93c9","Type":"ContainerStarted","Data":"31f3f979566ce33e6e25afb8c6c12e284643feed7339be096ba432006af18c4f"}
Apr 24 22:31:22.008348 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:22.008286 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm"
Apr 24 22:31:22.024154 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:22.024108 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm" podStartSLOduration=1.024094452 podStartE2EDuration="1.024094452s" podCreationTimestamp="2026-04-24 22:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:31:22.022727567 +0000 UTC m=+3823.962573299" watchObservedRunningTime="2026-04-24 22:31:22.024094452 +0000 UTC m=+3823.963940215"
Apr 24 22:31:22.117901 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:22.117878 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-vdppq_c69c2633-a089-45fd-9a6f-5b56c0d7beb1/dns-node-resolver/0.log"
Apr 24 22:31:22.543021 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:22.542985 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6d8fc7475c-md5gj_50576603-0528-4100-9713-5d6578a97229/registry/0.log"
Apr 24 22:31:22.586512 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:22.586492 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-flsks_4327adce-270c-40d3-b3a2-3f3c1acfa545/node-ca/0.log"
Apr 24 22:31:23.650924 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:23.650891 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-pzzlw_51fc9513-bf57-4b5f-9a7c-f7325f046b26/serve-healthcheck-canary/0.log"
Apr 24 22:31:24.023553 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:24.023522 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-l5zd6_114bfe15-0df7-402e-b377-0bf72321706b/insights-operator/0.log"
Apr 24 22:31:24.024402 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:24.024375 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-l5zd6_114bfe15-0df7-402e-b377-0bf72321706b/insights-operator/1.log"
Apr 24 22:31:24.050320 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:24.050300 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fcwpj_ba08d1de-7f2a-43fa-9f8c-c670824b9bdb/kube-rbac-proxy/0.log"
Apr 24 22:31:24.073126 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:24.073104 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fcwpj_ba08d1de-7f2a-43fa-9f8c-c670824b9bdb/exporter/0.log"
Apr 24 22:31:24.098183 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:24.098162 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fcwpj_ba08d1de-7f2a-43fa-9f8c-c670824b9bdb/extractor/0.log"
Apr 24 22:31:26.231935 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:26.231907 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-84b6647887-n6kbk_d8c0694b-1dac-4f93-b968-338affc8e878/manager/0.log"
Apr 24 22:31:26.265261 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:26.265241 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-tnrjt_41b38f4c-58f1-4d05-b65f-38c0b99bd1cd/manager/0.log"
Apr 24 22:31:28.022904 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:28.022872 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-hdbpm"
Apr 24 22:31:31.071511 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:31.071485 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-szrrw_d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f/kube-storage-version-migrator-operator/1.log"
Apr 24 22:31:31.072279 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:31.072254 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-szrrw_d51e6e0b-2cb3-4dcb-94ad-d4c80e19405f/kube-storage-version-migrator-operator/0.log"
Apr 24 22:31:32.310274 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:32.310243 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kb7pn_edd258c5-66bc-4d60-8302-4f99f9bfa7dc/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:31:32.356160 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:32.356135 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kb7pn_edd258c5-66bc-4d60-8302-4f99f9bfa7dc/egress-router-binary-copy/0.log"
Apr 24 22:31:32.395120 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:32.395100 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kb7pn_edd258c5-66bc-4d60-8302-4f99f9bfa7dc/cni-plugins/0.log"
Apr 24 22:31:32.415201 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:32.415153 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kb7pn_edd258c5-66bc-4d60-8302-4f99f9bfa7dc/bond-cni-plugin/0.log"
Apr 24 22:31:32.441718 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:32.441702 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kb7pn_edd258c5-66bc-4d60-8302-4f99f9bfa7dc/routeoverride-cni/0.log"
Apr 24 22:31:32.463786 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:32.463768 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kb7pn_edd258c5-66bc-4d60-8302-4f99f9bfa7dc/whereabouts-cni-bincopy/0.log"
Apr 24 22:31:32.486471 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:32.486446 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kb7pn_edd258c5-66bc-4d60-8302-4f99f9bfa7dc/whereabouts-cni/0.log"
Apr 24 22:31:32.554917 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:32.554895 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nv6gl_abbc168a-6ea4-427c-8c8d-16f6a126b2a8/kube-multus/0.log"
Apr 24 22:31:32.679244 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:32.679195 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-l78bh_6ec8ce7f-d73f-4ff5-a981-9d84448a51a6/network-metrics-daemon/0.log"
Apr 24 22:31:32.697677 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:32.697659 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-l78bh_6ec8ce7f-d73f-4ff5-a981-9d84448a51a6/kube-rbac-proxy/0.log"
Apr 24 22:31:34.091565 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:34.091539 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-controller/0.log"
Apr 24 22:31:34.110748 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:34.110725 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/0.log"
Apr 24 22:31:34.126390 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:34.126365 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovn-acl-logging/1.log"
Apr 24 22:31:34.146215 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:34.146192 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/kube-rbac-proxy-node/0.log"
Apr 24 22:31:34.166033 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:34.166017 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:31:34.184319 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:34.184296 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/northd/0.log"
Apr 24 22:31:34.204384 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:34.204368 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/nbdb/0.log"
Apr 24 22:31:34.230823 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:34.230801 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/sbdb/0.log"
Apr 24 22:31:34.333323 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:34.333297 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zglkz_88ca30bc-f546-43f6-8751-e5c36307eb86/ovnkube-controller/0.log"
Apr 24 22:31:35.318669 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:35.318642 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-c2lfl_bdffaacb-7941-4781-a486-7ed533d52846/check-endpoints/0.log"
Apr 24 22:31:35.423768 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:35.423742 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-2sm45_a3e1bc5e-3bc3-4e15-a162-3e3b6e59374f/network-check-target-container/0.log"
Apr 24 22:31:36.333260 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:36.333233 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-6hct6_bb182b1c-327c-4054-8814-10769b9fc643/iptables-alerter/0.log"
Apr 24 22:31:37.029288 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:37.029240 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-l5g4l_155ba158-2f39-4023-b916-b8d0af483d46/tuned/0.log"
Apr 24 22:31:39.573146 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:39.573080 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-fbtc5_2e9ba745-815f-4019-a172-e88557fff65c/service-ca-operator/1.log"
Apr 24 22:31:39.574300 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:39.574270 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-fbtc5_2e9ba745-815f-4019-a172-e88557fff65c/service-ca-operator/0.log"
Apr 24 22:31:39.868903 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:39.868836 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-jtfx2_b3f7a4ce-1dd3-4768-89a3-e35106a565cf/service-ca-controller/0.log"
Apr 24 22:31:40.225057 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:40.225033 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-gbl5r_01b5838c-cf53-4f25-8edb-f0bb7176b567/csi-driver/0.log"
Apr 24 22:31:40.244619 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:40.244596 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-gbl5r_01b5838c-cf53-4f25-8edb-f0bb7176b567/csi-node-driver-registrar/0.log"
Apr 24 22:31:40.264433 ip-10-0-136-160 kubenswrapper[2575]: I0424 22:31:40.264406 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-gbl5r_01b5838c-cf53-4f25-8edb-f0bb7176b567/csi-liveness-probe/0.log"