Apr 20 11:39:24.807098 ip-10-0-141-26 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 11:39:24.807110 ip-10-0-141-26 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 11:39:24.807117 ip-10-0-141-26 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 11:39:24.807361 ip-10-0-141-26 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 11:39:34.932078 ip-10-0-141-26 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 11:39:34.932094 ip-10-0-141-26 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 8a15e2311315449dae8ee4aeffddd0d3 --
Apr 20 11:41:52.414975 ip-10-0-141-26 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 11:41:52.827229 ip-10-0-141-26 kubenswrapper[2585]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 11:41:52.827229 ip-10-0-141-26 kubenswrapper[2585]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 11:41:52.827229 ip-10-0-141-26 kubenswrapper[2585]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 11:41:52.827229 ip-10-0-141-26 kubenswrapper[2585]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 11:41:52.827229 ip-10-0-141-26 kubenswrapper[2585]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 11:41:52.830572 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.830477 2585 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 11:41:52.833599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833584 2585 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 11:41:52.833599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833599 2585 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 11:41:52.833660 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833603 2585 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 11:41:52.833660 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833606 2585 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 11:41:52.833660 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833608 2585 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 11:41:52.833660 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833614 2585 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 11:41:52.833660 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833619 2585 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 11:41:52.833660 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833622 2585 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 11:41:52.833660 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833625 2585 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 11:41:52.833660 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833628 2585 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 11:41:52.833660 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833631 2585 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 11:41:52.833660 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833634 2585 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 11:41:52.833660 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833637 2585 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 11:41:52.833660 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833640 2585 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 11:41:52.833660 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833642 2585 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 11:41:52.833660 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833645 2585 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 11:41:52.833660 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833647 2585 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 11:41:52.833660 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833650 2585 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 11:41:52.833660 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833652 2585 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 11:41:52.833660 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833655 2585 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 11:41:52.833660 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833658 2585 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 11:41:52.834114 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833661 2585 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 11:41:52.834114 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833663 2585 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 11:41:52.834114 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833666 2585 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 11:41:52.834114 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833669 2585 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 11:41:52.834114 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833672 2585 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 11:41:52.834114 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833675 2585 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 11:41:52.834114 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833677 2585 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 11:41:52.834114 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833679 2585 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 11:41:52.834114 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833682 2585 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 11:41:52.834114 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833685 2585 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 11:41:52.834114 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833701 2585 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 11:41:52.834114 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833704 2585 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 11:41:52.834114 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833707 2585 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 11:41:52.834114 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833710 2585 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 11:41:52.834114 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833713 2585 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 11:41:52.834114 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833716 2585 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 11:41:52.834114 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833719 2585 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 11:41:52.834114 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833721 2585 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 11:41:52.834114 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833723 2585 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 11:41:52.834114 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833726 2585 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 11:41:52.834591 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833730 2585 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 11:41:52.834591 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833733 2585 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 11:41:52.834591 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833735 2585 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 11:41:52.834591 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833738 2585 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 11:41:52.834591 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833740 2585 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 11:41:52.834591 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833742 2585 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 11:41:52.834591 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833745 2585 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 11:41:52.834591 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833747 2585 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 11:41:52.834591 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833750 2585 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 11:41:52.834591 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833752 2585 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 11:41:52.834591 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833755 2585 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 11:41:52.834591 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833757 2585 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 11:41:52.834591 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833760 2585 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 11:41:52.834591 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833763 2585 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 11:41:52.834591 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833766 2585 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 11:41:52.834591 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833769 2585 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 11:41:52.834591 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833771 2585 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 11:41:52.834591 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833774 2585 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 11:41:52.834591 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833777 2585 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 11:41:52.834591 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833780 2585 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 11:41:52.835083 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833783 2585 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 11:41:52.835083 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833785 2585 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 11:41:52.835083 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833788 2585 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 11:41:52.835083 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833792 2585 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 11:41:52.835083 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833794 2585 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 11:41:52.835083 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833798 2585 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 11:41:52.835083 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833802 2585 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 11:41:52.835083 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833805 2585 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 11:41:52.835083 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833807 2585 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 11:41:52.835083 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833810 2585 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 11:41:52.835083 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833812 2585 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 11:41:52.835083 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833815 2585 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 11:41:52.835083 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833817 2585 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 11:41:52.835083 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833820 2585 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 11:41:52.835083 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833822 2585 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 11:41:52.835083 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833825 2585 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 11:41:52.835083 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833827 2585 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 11:41:52.835083 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833830 2585 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 11:41:52.835083 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833832 2585 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 11:41:52.835541 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833835 2585 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 11:41:52.835541 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833837 2585 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 11:41:52.835541 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833840 2585 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 11:41:52.835541 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833842 2585 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 11:41:52.835541 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833845 2585 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 11:41:52.835541 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.833848 2585 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 11:41:52.836011 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.835999 2585 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 11:41:52.836011 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836011 2585 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 11:41:52.836073 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836016 2585 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 11:41:52.836073 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836020 2585 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 11:41:52.836073 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836023 2585 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 11:41:52.836073 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836026 2585 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 11:41:52.836073 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836029 2585 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 11:41:52.836073 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836031 2585 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 11:41:52.836073 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836034 2585 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 11:41:52.836073 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836036 2585 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 11:41:52.836073 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836039 2585 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 11:41:52.836073 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836041 2585 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 11:41:52.836073 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836044 2585 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 11:41:52.836073 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836046 2585 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 11:41:52.836073 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836049 2585 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 11:41:52.836073 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836056 2585 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 11:41:52.836073 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836059 2585 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 11:41:52.836073 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836061 2585 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 11:41:52.836073 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836064 2585 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 11:41:52.836073 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836067 2585 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 11:41:52.836073 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836069 2585 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 11:41:52.836073 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836071 2585 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 11:41:52.836607 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836074 2585 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 11:41:52.836607 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836077 2585 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 11:41:52.836607 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836080 2585 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 11:41:52.836607 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836082 2585 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 11:41:52.836607 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836085 2585 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 11:41:52.836607 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836088 2585 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 11:41:52.836607 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836090 2585 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 11:41:52.836607 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836093 2585 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 11:41:52.836607 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836095 2585 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 11:41:52.836607 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836098 2585 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 11:41:52.836607 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836100 2585 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 11:41:52.836607 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836103 2585 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 11:41:52.836607 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836105 2585 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 11:41:52.836607 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836108 2585 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 11:41:52.836607 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836110 2585 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 11:41:52.836607 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836113 2585 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 11:41:52.836607 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836115 2585 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 11:41:52.836607 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836119 2585 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 11:41:52.836607 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836121 2585 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 11:41:52.836607 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836124 2585 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 11:41:52.837149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836127 2585 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 11:41:52.837149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836131 2585 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 11:41:52.837149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836134 2585 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 11:41:52.837149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836136 2585 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 11:41:52.837149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836139 2585 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 11:41:52.837149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836142 2585 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 11:41:52.837149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836145 2585 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 11:41:52.837149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836148 2585 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 11:41:52.837149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836151 2585 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 11:41:52.837149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836153 2585 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 11:41:52.837149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836156 2585 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 11:41:52.837149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836158 2585 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 11:41:52.837149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836160 2585 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 11:41:52.837149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836164 2585 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 11:41:52.837149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836166 2585 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 11:41:52.837149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836169 2585 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 11:41:52.837149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836172 2585 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 11:41:52.837149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836174 2585 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 11:41:52.837149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836176 2585 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 11:41:52.837610 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836179 2585 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 11:41:52.837610 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836181 2585 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 11:41:52.837610 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836184 2585 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 11:41:52.837610 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836186 2585 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 11:41:52.837610 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836189 2585 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 11:41:52.837610 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836191 2585 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 11:41:52.837610 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836193 2585 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 11:41:52.837610 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836196 2585 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 11:41:52.837610 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836198 2585 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 11:41:52.837610 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836200 2585 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 11:41:52.837610 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836203 2585 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 11:41:52.837610 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836205 2585 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 11:41:52.837610 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836207 2585 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 11:41:52.837610 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836210 2585 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 11:41:52.837610 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836212 2585 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 11:41:52.837610 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836215 2585 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 11:41:52.837610 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836217 2585 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 11:41:52.837610 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836220 2585 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 11:41:52.837610 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836222 2585 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 11:41:52.837610 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836238 2585 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836241 2585 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836244 2585 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836246 2585 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836249 2585 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.836251 2585 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836320 2585 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836327 2585 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836334 2585 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836341 2585 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836348 2585 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836352 2585 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836359 2585 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836365 2585 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836368 2585 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836371 2585 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836374 2585 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836378 2585 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836381 2585 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836384 2585 flags.go:64] FLAG: --cgroup-root=""
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836387 2585 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836389 2585 flags.go:64] FLAG: --client-ca-file=""
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836392 2585 flags.go:64] FLAG: --cloud-config=""
Apr 20 11:41:52.838152 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836395 2585 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836398 2585 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836402 2585 flags.go:64] FLAG: --cluster-domain=""
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836404 2585 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836407 2585 flags.go:64] FLAG: --config-dir=""
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836410 2585 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836413 2585 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836417 2585 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836420 2585 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836423 2585 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836427 2585 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836430 2585 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836433 2585 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836436 2585 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836439 2585 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836442 2585 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836450 2585 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836453 2585 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836456 2585 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836459 2585 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836462 2585 flags.go:64] FLAG: --enable-server="true"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836464 2585 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836472 2585 flags.go:64] FLAG: --event-burst="100"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836475 2585 flags.go:64] FLAG: --event-qps="50"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836478 2585 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 11:41:52.838723 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836481 2585 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836484 2585 flags.go:64] FLAG: --eviction-hard=""
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836488 2585 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836491 2585 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836493 2585 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836496 2585 flags.go:64] FLAG: --eviction-soft=""
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836500 2585 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836502 2585 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836505 2585 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836508 2585 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836510 2585 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836513 2585 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836516 2585 flags.go:64] FLAG: --feature-gates=""
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836519 2585 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836522 2585 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836525 2585 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836528 2585 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836532 2585 flags.go:64] FLAG: --healthz-port="10248"
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836553 2585 flags.go:64] FLAG: --help="false"
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836557 2585 flags.go:64] FLAG: --hostname-override="ip-10-0-141-26.ec2.internal"
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836560 2585 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836563 2585 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836568 2585 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836572 2585 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 20 11:41:52.839367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836576 2585 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836579 2585 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836581 2585 flags.go:64] FLAG: --image-service-endpoint=""
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836584 2585 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836587 2585 flags.go:64] FLAG: --kube-api-burst="100"
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836590 2585 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836593 2585 flags.go:64] FLAG: --kube-api-qps="50"
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836596 2585 flags.go:64] FLAG: --kube-reserved=""
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836599 2585 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836601 2585 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836604 2585 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836607 2585 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836610 2585 flags.go:64] FLAG: --lock-file=""
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836612 2585 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836616 2585 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836619 2585 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836624 2585 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836627 2585 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836630 2585 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836633 2585 flags.go:64] FLAG: --logging-format="text"
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836635 2585 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836639 2585 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836642 2585 flags.go:64] FLAG: --manifest-url=""
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836644 2585 flags.go:64] FLAG: --manifest-url-header=""
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836649 2585 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 20 11:41:52.839961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836652 2585 flags.go:64] FLAG: --max-open-files="1000000"
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836656 2585 flags.go:64] FLAG: --max-pods="110"
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836664 2585 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836667 2585 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836670 2585 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836674 2585 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836677 2585 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836680 2585 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836683 2585 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836705 2585 flags.go:64] FLAG: --node-status-max-images="50"
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836708 2585 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836711 2585 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836714 2585 flags.go:64] FLAG: --pod-cidr=""
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836718 2585 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836726 2585 flags.go:64] FLAG: --pod-manifest-path=""
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836729 2585 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836732 2585 flags.go:64] FLAG: --pods-per-core="0"
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836735 2585 flags.go:64] FLAG: --port="10250"
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836738 2585 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836741 2585 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f10f8e570e5154e7"
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836744 2585 flags.go:64] FLAG: --qos-reserved=""
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836747 2585 flags.go:64] FLAG: --read-only-port="10255"
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836750 2585 flags.go:64] FLAG: --register-node="true"
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836753 2585 flags.go:64] FLAG: --register-schedulable="true"
Apr 20 11:41:52.840556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836756 2585 flags.go:64] FLAG: --register-with-taints=""
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836759 2585 flags.go:64] FLAG: --registry-burst="10"
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836762 2585 flags.go:64] FLAG: --registry-qps="5"
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836766 2585 flags.go:64] FLAG: --reserved-cpus=""
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836768 2585 flags.go:64] FLAG: --reserved-memory=""
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836772 2585 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836775 2585 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836778 2585 flags.go:64] FLAG: --rotate-certificates="false"
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836781 2585 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836783 2585 flags.go:64] FLAG: --runonce="false"
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836786 2585 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836795 2585 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836798 2585 flags.go:64] FLAG: --seccomp-default="false"
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836801 2585 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836804 2585 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836807 2585 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836810 2585 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836813 2585 flags.go:64] FLAG: --storage-driver-password="root"
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836816 2585 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836819 2585 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836822 2585 flags.go:64] FLAG: --storage-driver-user="root"
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836830 2585 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836833 2585 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836836 2585 flags.go:64] FLAG: --system-cgroups=""
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836839 2585 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 20 11:41:52.841164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836844 2585 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836847 2585 flags.go:64] FLAG: --tls-cert-file=""
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836850 2585 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836856 2585 flags.go:64] FLAG: --tls-min-version=""
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836859 2585 flags.go:64] FLAG: --tls-private-key-file=""
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836862 2585 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836865 2585 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836868 2585 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836870 2585 flags.go:64] FLAG: --v="2"
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836875 2585 flags.go:64] FLAG: --version="false"
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836878 2585 flags.go:64] FLAG: --vmodule=""
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836883 2585 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.836886 2585 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837007 2585 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837011 2585 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837014 2585 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837017 2585 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837020 2585 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837022 2585 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837031 2585 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837034 2585 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837036 2585 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837039 2585 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 11:41:52.841779 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837042 2585 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 11:41:52.842318 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837044 2585 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 11:41:52.842318 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837047 2585 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 11:41:52.842318 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837050 2585 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 11:41:52.842318 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837054 2585 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 11:41:52.842318 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837057 2585 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 11:41:52.842318 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837059 2585 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 11:41:52.842318 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837062 2585 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 11:41:52.842318 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837064 2585 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 11:41:52.842318 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837067 2585 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 11:41:52.842318 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837069 2585 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 11:41:52.842318 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837072 2585 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 11:41:52.842318 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837074 2585 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 11:41:52.842318 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837076 2585 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 11:41:52.842318 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837079 2585 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 11:41:52.842318 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837081 2585 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 11:41:52.842318 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837084 2585 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 11:41:52.842318 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837103 2585 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 11:41:52.842318 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837107 2585 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 11:41:52.842318 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837109 2585 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 11:41:52.842318 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837112 2585 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 11:41:52.842874 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837115 2585 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 11:41:52.842874 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837118 2585 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 11:41:52.842874 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837121 2585 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 11:41:52.842874 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837123 2585 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 11:41:52.842874 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837126 2585 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 11:41:52.842874 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837128 2585 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 11:41:52.842874 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837131 2585 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 11:41:52.842874 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837133 2585 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 11:41:52.842874 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837141 2585 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 11:41:52.842874 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837144 2585 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 11:41:52.842874 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837147 2585 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 11:41:52.842874 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837149 2585 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 11:41:52.842874 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837152 2585 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 11:41:52.842874 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837155 2585 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 11:41:52.842874 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837158 2585 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 11:41:52.842874 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837160 2585 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 11:41:52.842874 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837163 2585 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 11:41:52.842874 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837166 2585 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 11:41:52.842874 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837168 2585 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 11:41:52.842874 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837171 2585 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 11:41:52.843374 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837173 2585 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 11:41:52.843374 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837176 2585 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 11:41:52.843374 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837178 2585 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 11:41:52.843374 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837180 2585 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 11:41:52.843374 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837183 2585 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 11:41:52.843374 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837185 2585 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 11:41:52.843374 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837188 2585 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 11:41:52.843374 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837190 2585 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 11:41:52.843374 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837194 2585 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 11:41:52.843374 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837197 2585 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 11:41:52.843374 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837200 2585 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 11:41:52.843374 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837202 2585 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 11:41:52.843374 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837205 2585 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 11:41:52.843374 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837207 2585 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 11:41:52.843374 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837209 2585 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 11:41:52.843374 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837212 2585 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 11:41:52.843374 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837214 2585 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 11:41:52.843374 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837217 2585 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 11:41:52.843374 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837219 2585 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 11:41:52.843856 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837221 2585 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 11:41:52.843856 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837226 2585 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 11:41:52.843856 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837235 2585 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 11:41:52.843856 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837238 2585 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 11:41:52.843856 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837241 2585 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 11:41:52.843856 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837243 2585 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 11:41:52.843856 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837246 2585 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 11:41:52.843856 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837248 2585 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 11:41:52.843856 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837251 2585 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 11:41:52.843856 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837253 2585 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 11:41:52.843856 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837256 2585 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 11:41:52.843856 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837258 2585 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 11:41:52.843856 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837261 2585 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 11:41:52.843856 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837263 2585 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 11:41:52.843856 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837265 2585 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 11:41:52.843856 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.837268 2585 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 11:41:52.844256 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.838032 2585 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 11:41:52.846058 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.846034 2585 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 11:41:52.846058 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.846056 2585 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 11:41:52.846193 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846109 2585 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 11:41:52.846193 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846115 2585 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 11:41:52.846193 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846119 2585 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 11:41:52.846193 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846122 2585 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 11:41:52.846193 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846125 2585 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 11:41:52.846193 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846127 2585 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 11:41:52.846193 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846130 2585 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 11:41:52.846193 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846133 2585 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 11:41:52.846193 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846135 2585 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 11:41:52.846193 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846138 2585 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 11:41:52.846193 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846140 2585 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 11:41:52.846193 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846142 2585 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 11:41:52.846193 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846145 2585 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 11:41:52.846193 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846148 2585 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 11:41:52.846193 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846150 2585 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 11:41:52.846193 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846153 2585 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 11:41:52.846193 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846155 2585 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 11:41:52.846193 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846158 2585 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 11:41:52.846193 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846160 2585 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 11:41:52.846193 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846163 2585 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 11:41:52.846676 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846166 2585 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 11:41:52.846676 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846169 2585 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 11:41:52.846676 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846171 2585 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 11:41:52.846676 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846173 2585 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 11:41:52.846676 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846176 2585 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 11:41:52.846676 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846179 2585 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 11:41:52.846676 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846181 2585 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 11:41:52.846676 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846185 2585 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 11:41:52.846676 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846188 2585 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 11:41:52.846676 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846191 2585 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 11:41:52.846676 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846194 2585 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 11:41:52.846676 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846196 2585 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 11:41:52.846676 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846199 2585 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 11:41:52.846676 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846202 2585 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 11:41:52.846676 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846205 2585 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 11:41:52.846676 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846208 2585 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 11:41:52.846676 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846210 2585 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 11:41:52.846676 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846213 2585 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 11:41:52.846676 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846215 2585 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 11:41:52.847192 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846217 2585 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 11:41:52.847192 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846220 2585 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 11:41:52.847192 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846223 2585 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 11:41:52.847192 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846225 2585 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 11:41:52.847192 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846229 2585 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 11:41:52.847192 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846233 2585 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 11:41:52.847192 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846236 2585 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 11:41:52.847192 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846240 2585 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 11:41:52.847192 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846242 2585 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 11:41:52.847192 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846245 2585 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 11:41:52.847192 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846248 2585 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 11:41:52.847192 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846251 2585 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 11:41:52.847192 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846253 2585 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 11:41:52.847192 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846256 2585 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 11:41:52.847192 ip-10-0-141-26 
kubenswrapper[2585]: W0420 11:41:52.846259 2585 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 11:41:52.847192 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846262 2585 feature_gate.go:328] unrecognized feature gate: Example Apr 20 11:41:52.847192 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846264 2585 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 11:41:52.847192 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846267 2585 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 11:41:52.847192 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846269 2585 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 11:41:52.847192 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846272 2585 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 11:41:52.847685 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846275 2585 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 11:41:52.847685 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846277 2585 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 11:41:52.847685 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846280 2585 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 11:41:52.847685 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846282 2585 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 11:41:52.847685 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846285 2585 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 11:41:52.847685 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846287 2585 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 11:41:52.847685 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846290 2585 feature_gate.go:328] unrecognized feature 
gate: GCPClusterHostedDNSInstall Apr 20 11:41:52.847685 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846292 2585 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 11:41:52.847685 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846295 2585 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 11:41:52.847685 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846297 2585 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 11:41:52.847685 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846299 2585 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 11:41:52.847685 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846302 2585 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 11:41:52.847685 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846304 2585 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 11:41:52.847685 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846307 2585 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 11:41:52.847685 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846309 2585 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 11:41:52.847685 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846311 2585 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 11:41:52.847685 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846314 2585 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 11:41:52.847685 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846317 2585 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 11:41:52.847685 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846320 2585 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 11:41:52.847685 ip-10-0-141-26 kubenswrapper[2585]: W0420 
11:41:52.846322 2585 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 11:41:52.848187 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846325 2585 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 11:41:52.848187 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846327 2585 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 11:41:52.848187 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846329 2585 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 11:41:52.848187 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846332 2585 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 11:41:52.848187 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846334 2585 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 11:41:52.848187 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846336 2585 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 11:41:52.848187 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846340 2585 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 11:41:52.848187 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.846345 2585 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 11:41:52.848187 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846437 2585 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 11:41:52.848187 ip-10-0-141-26 kubenswrapper[2585]: W0420 
11:41:52.846443 2585 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 11:41:52.848187 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846446 2585 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 11:41:52.848187 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846449 2585 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 11:41:52.848187 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846452 2585 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 11:41:52.848187 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846455 2585 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 11:41:52.848187 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846458 2585 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 11:41:52.848187 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846461 2585 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 11:41:52.848599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846464 2585 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 11:41:52.848599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846467 2585 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 11:41:52.848599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846469 2585 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 11:41:52.848599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846472 2585 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 11:41:52.848599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846475 2585 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 11:41:52.848599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846478 2585 feature_gate.go:328] unrecognized feature gate: 
NewOLMWebhookProviderOpenshiftServiceCA Apr 20 11:41:52.848599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846481 2585 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 11:41:52.848599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846483 2585 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 11:41:52.848599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846486 2585 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 11:41:52.848599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846488 2585 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 11:41:52.848599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846490 2585 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 11:41:52.848599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846493 2585 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 11:41:52.848599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846495 2585 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 11:41:52.848599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846498 2585 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 11:41:52.848599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846501 2585 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 11:41:52.848599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846504 2585 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 11:41:52.848599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846506 2585 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 11:41:52.848599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846509 2585 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 11:41:52.848599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846512 2585 
feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 11:41:52.848599 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846514 2585 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 11:41:52.849108 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846516 2585 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 11:41:52.849108 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846519 2585 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 11:41:52.849108 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846521 2585 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 11:41:52.849108 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846526 2585 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 11:41:52.849108 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846530 2585 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 11:41:52.849108 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846532 2585 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 11:41:52.849108 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846535 2585 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 11:41:52.849108 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846537 2585 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 11:41:52.849108 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846540 2585 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 11:41:52.849108 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846542 2585 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 11:41:52.849108 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846545 2585 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController 
Apr 20 11:41:52.849108 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846547 2585 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 11:41:52.849108 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846550 2585 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 11:41:52.849108 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846552 2585 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 11:41:52.849108 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846554 2585 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 11:41:52.849108 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846557 2585 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 11:41:52.849108 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846559 2585 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 11:41:52.849108 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846562 2585 feature_gate.go:328] unrecognized feature gate: Example Apr 20 11:41:52.849108 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846564 2585 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 11:41:52.849566 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846566 2585 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 11:41:52.849566 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846569 2585 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 11:41:52.849566 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846571 2585 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 11:41:52.849566 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846574 2585 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 11:41:52.849566 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846576 2585 feature_gate.go:328] 
unrecognized feature gate: ExternalOIDC Apr 20 11:41:52.849566 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846578 2585 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 11:41:52.849566 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846581 2585 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 11:41:52.849566 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846583 2585 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 11:41:52.849566 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846586 2585 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 11:41:52.849566 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846589 2585 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 11:41:52.849566 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846591 2585 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 11:41:52.849566 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846593 2585 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 11:41:52.849566 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846596 2585 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 11:41:52.849566 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846598 2585 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 11:41:52.849566 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846601 2585 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 11:41:52.849566 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846603 2585 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 11:41:52.849566 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846606 2585 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 11:41:52.849566 ip-10-0-141-26 kubenswrapper[2585]: 
W0420 11:41:52.846609 2585 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 11:41:52.849566 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846612 2585 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 11:41:52.849566 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846614 2585 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 11:41:52.850149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846616 2585 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 11:41:52.850149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846619 2585 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 11:41:52.850149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846621 2585 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 11:41:52.850149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846624 2585 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 11:41:52.850149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846626 2585 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 11:41:52.850149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846628 2585 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 11:41:52.850149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846631 2585 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 11:41:52.850149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846633 2585 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 11:41:52.850149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846635 2585 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 11:41:52.850149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846638 2585 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 11:41:52.850149 
ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846640 2585 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 11:41:52.850149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846643 2585 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 11:41:52.850149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846645 2585 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 11:41:52.850149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846647 2585 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 11:41:52.850149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846650 2585 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 11:41:52.850149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846652 2585 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 11:41:52.850149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846654 2585 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 11:41:52.850149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846658 2585 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 11:41:52.850149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:52.846661 2585 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 11:41:52.850635 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.846667 2585 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 11:41:52.850635 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.847348 2585 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 11:41:52.850635 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.849851 2585 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 11:41:52.850783 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.850771 2585 server.go:1019] "Starting client certificate rotation" Apr 20 11:41:52.850885 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.850867 2585 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 11:41:52.850920 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.850913 2585 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 11:41:52.876383 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.876364 2585 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 11:41:52.879136 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.879110 2585 dynamic_cafile_content.go:161] "Starting controller" 
name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 11:41:52.895396 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.895377 2585 log.go:25] "Validated CRI v1 runtime API"
Apr 20 11:41:52.901064 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.901048 2585 log.go:25] "Validated CRI v1 image API"
Apr 20 11:41:52.903160 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.903144 2585 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 11:41:52.907502 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.907470 2585 fs.go:135] Filesystem UUIDs: map[65bad73f-bd62-4c86-91f3-3721735914c2:/dev/nvme0n1p4 72bc3416-1bc9-4711-8981-30a5ef320308:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 20 11:41:52.907595 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.907507 2585 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 11:41:52.908264 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.908236 2585 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 11:41:52.913451 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.913347 2585 manager.go:217] Machine: {Timestamp:2026-04-20 11:41:52.911811648 +0000 UTC m=+0.384912725 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3073930 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec279c371cc60fc2ce4e4e252db45834 SystemUUID:ec279c37-1cc6-0fc2-ce4e-4e252db45834 BootID:8a15e231-1315-449d-ae8e-e4aeffddd0d3 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:dc:32:2a:6f:25 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:dc:32:2a:6f:25 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:8e:84:5d:55:7d:8e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 11:41:52.913451 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.913443 2585 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 11:41:52.913575 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.913552 2585 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 11:41:52.914453 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.914430 2585 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 11:41:52.914585 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.914457 2585 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-26.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 11:41:52.914636 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.914594 2585 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 11:41:52.914636 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.914602 2585 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 11:41:52.914636 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.914617 2585 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 11:41:52.915399 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.915389 2585 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 11:41:52.916152 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.916142 2585 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 11:41:52.916278 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.916269 2585 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 11:41:52.918234 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.918222 2585 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 11:41:52.918278 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.918238 2585 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 11:41:52.918278 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.918249 2585 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 11:41:52.918278 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.918258 2585 kubelet.go:397] "Adding apiserver pod source"
Apr 20 11:41:52.918278 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.918267 2585 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 11:41:52.919264 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.919251 2585 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 11:41:52.919309 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.919274 2585 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 11:41:52.922274 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.922256 2585 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 11:41:52.923892 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.923879 2585 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 11:41:52.925078 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.925054 2585 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 11:41:52.925078 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.925076 2585 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 11:41:52.925163 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.925085 2585 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 11:41:52.925163 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.925091 2585 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 11:41:52.925163 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.925097 2585 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 11:41:52.925163 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.925102 2585 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 11:41:52.925163 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.925108 2585 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 11:41:52.925163 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.925113 2585 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 11:41:52.925163 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.925120 2585 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 11:41:52.925163 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.925127 2585 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 11:41:52.925163 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.925140 2585 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 11:41:52.925163 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.925149 2585 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 11:41:52.925905 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.925896 2585 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 11:41:52.925945 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.925906 2585 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 11:41:52.929527 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.929390 2585 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 11:41:52.929590 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.929552 2585 server.go:1295] "Started kubelet"
Apr 20 11:41:52.929676 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.929626 2585 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 11:41:52.929783 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.929731 2585 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 11:41:52.929819 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.929812 2585 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 11:41:52.930183 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.930166 2585 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-26.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 11:41:52.930234 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:52.930185 2585 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-26.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 11:41:52.930305 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:52.930288 2585 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 11:41:52.930523 ip-10-0-141-26 systemd[1]: Started Kubernetes Kubelet.
Apr 20 11:41:52.931204 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.931167 2585 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 11:41:52.932559 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.932527 2585 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 11:41:52.936183 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.936167 2585 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 11:41:52.936885 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.936867 2585 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 11:41:52.937608 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.937592 2585 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 11:41:52.937799 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.937738 2585 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 11:41:52.938944 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.938930 2585 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 11:41:52.939040 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.938722 2585 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 11:41:52.939131 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.939119 2585 factory.go:55] Registering systemd factory
Apr 20 11:41:52.939214 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.939195 2585 factory.go:223] Registration of the systemd container factory successfully
Apr 20 11:41:52.939299 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:52.938205 2585 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-26.ec2.internal\" not found"
Apr 20 11:41:52.939299 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.939062 2585 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 11:41:52.939299 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.939282 2585 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 11:41:52.939447 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.939436 2585 factory.go:153] Registering CRI-O factory
Apr 20 11:41:52.939485 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.939449 2585 factory.go:223] Registration of the crio container factory successfully
Apr 20 11:41:52.939485 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.939470 2585 factory.go:103] Registering Raw factory
Apr 20 11:41:52.939485 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.939482 2585 manager.go:1196] Started watching for new ooms in manager
Apr 20 11:41:52.939818 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:52.938749 2585 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-26.ec2.internal.18a80de0fad17213 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-26.ec2.internal,UID:ip-10-0-141-26.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-26.ec2.internal,},FirstTimestamp:2026-04-20 11:41:52.929526291 +0000 UTC m=+0.402627368,LastTimestamp:2026-04-20 11:41:52.929526291 +0000 UTC m=+0.402627368,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-26.ec2.internal,}"
Apr 20 11:41:52.939891 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:52.939818 2585 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 11:41:52.939891 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.939885 2585 manager.go:319] Starting recovery of all containers
Apr 20 11:41:52.943242 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:52.943212 2585 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-141-26.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 20 11:41:52.943342 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:52.943247 2585 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 20 11:41:52.950739 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.950722 2585 manager.go:324] Recovery completed
Apr 20 11:41:52.952050 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:52.952018 2585 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 20 11:41:52.954981 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.954965 2585 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 11:41:52.957230 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.957206 2585 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-26.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 11:41:52.957301 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.957238 2585 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-26.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 11:41:52.957301 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.957249 2585 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-26.ec2.internal" event="NodeHasSufficientPID"
Apr 20 11:41:52.957732 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.957717 2585 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 11:41:52.957795 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.957732 2585 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 11:41:52.957795 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.957749 2585 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 11:41:52.959795 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.959777 2585 policy_none.go:49] "None policy: Start"
Apr 20 11:41:52.959876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.959800 2585 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 11:41:52.959876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.959816 2585 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 11:41:52.959942 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:52.959838 2585 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-26.ec2.internal.18a80de0fc78182f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-26.ec2.internal,UID:ip-10-0-141-26.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-141-26.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-141-26.ec2.internal,},FirstTimestamp:2026-04-20 11:41:52.957225007 +0000 UTC m=+0.430326084,LastTimestamp:2026-04-20 11:41:52.957225007 +0000 UTC m=+0.430326084,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-26.ec2.internal,}"
Apr 20 11:41:52.969131 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:52.969070 2585 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-26.ec2.internal.18a80de0fc785f65 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-26.ec2.internal,UID:ip-10-0-141-26.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-141-26.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-141-26.ec2.internal,},FirstTimestamp:2026-04-20 11:41:52.957243237 +0000 UTC m=+0.430344313,LastTimestamp:2026-04-20 11:41:52.957243237 +0000 UTC m=+0.430344313,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-26.ec2.internal,}"
Apr 20 11:41:52.973963 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.973947 2585 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6xdgs"
Apr 20 11:41:52.977863 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:52.977802 2585 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-26.ec2.internal.18a80de0fc7886f3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-26.ec2.internal,UID:ip-10-0-141-26.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-141-26.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-141-26.ec2.internal,},FirstTimestamp:2026-04-20 11:41:52.957253363 +0000 UTC m=+0.430354440,LastTimestamp:2026-04-20 11:41:52.957253363 +0000 UTC m=+0.430354440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-26.ec2.internal,}"
Apr 20 11:41:52.981915 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.981899 2585 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6xdgs"
Apr 20 11:41:52.995264 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.995247 2585 manager.go:341] "Starting Device Plugin manager"
Apr 20 11:41:52.995345 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:52.995312 2585 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 11:41:52.995345 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.995326 2585 server.go:85] "Starting device plugin registration server"
Apr 20 11:41:52.995672 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.995658 2585 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 11:41:52.995746 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.995673 2585 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 11:41:52.996322 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.996056 2585 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 11:41:52.996322 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.996130 2585 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 11:41:52.996322 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:52.996137 2585 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 11:41:52.996511 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:52.996439 2585 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 11:41:52.996511 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:52.996480 2585 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-26.ec2.internal\" not found"
Apr 20 11:41:53.048511 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.048461 2585 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 11:41:53.050430 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.049818 2585 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 11:41:53.050430 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.049846 2585 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 11:41:53.050430 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.049865 2585 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 11:41:53.050430 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.049871 2585 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 11:41:53.050430 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:53.049905 2585 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 11:41:53.053278 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.053258 2585 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 11:41:53.096867 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.096795 2585 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 11:41:53.097785 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.097768 2585 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-26.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 11:41:53.097865 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.097798 2585 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-26.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 11:41:53.097865 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.097808 2585 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-26.ec2.internal" event="NodeHasSufficientPID"
Apr 20 11:41:53.097865 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.097831 2585 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-26.ec2.internal"
Apr 20 11:41:53.107919 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.107901 2585 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-26.ec2.internal"
Apr 20 11:41:53.107968 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:53.107925 2585 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-26.ec2.internal\": node \"ip-10-0-141-26.ec2.internal\" not found"
Apr 20 11:41:53.121663 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:53.121632 2585 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-26.ec2.internal\" not found"
Apr 20 11:41:53.150136 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.150107 2585 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-26.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-26.ec2.internal"]
Apr 20 11:41:53.150210 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.150187 2585 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 11:41:53.151986 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.151971 2585 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-26.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 11:41:53.152053 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.152000 2585 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-26.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 11:41:53.152053 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.152011 2585 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-26.ec2.internal" event="NodeHasSufficientPID"
Apr 20 11:41:53.154057 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.154044 2585 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 11:41:53.154213 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.154199 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-26.ec2.internal"
Apr 20 11:41:53.154268 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.154231 2585 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 11:41:53.154829 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.154810 2585 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-26.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 11:41:53.154945 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.154842 2585 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-26.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 11:41:53.154945 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.154813 2585 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-26.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 11:41:53.154945 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.154854 2585 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-26.ec2.internal" event="NodeHasSufficientPID"
Apr 20 11:41:53.154945 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.154876 2585 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-26.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 11:41:53.154945 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.154892 2585 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-26.ec2.internal" event="NodeHasSufficientPID"
Apr 20 11:41:53.156992 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.156976 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-26.ec2.internal"
Apr 20 11:41:53.157075 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.157006 2585 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 11:41:53.157648 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.157635 2585 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-26.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 11:41:53.157738 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.157663 2585 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-26.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 11:41:53.157738 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.157676 2585 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-26.ec2.internal" event="NodeHasSufficientPID"
Apr 20 11:41:53.181534 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:53.181506 2585 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-26.ec2.internal\" not found" node="ip-10-0-141-26.ec2.internal"
Apr 20 11:41:53.185821 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:53.185800 2585 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-26.ec2.internal\" not found" node="ip-10-0-141-26.ec2.internal"
Apr 20 11:41:53.222051 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:53.222020 2585 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-26.ec2.internal\" not found"
Apr 20 11:41:53.241405 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.241368 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f5fbd0742d2780e8ffac4af40fa72d97-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-26.ec2.internal\" (UID: \"f5fbd0742d2780e8ffac4af40fa72d97\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-26.ec2.internal"
Apr 20 11:41:53.241405 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.241399 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f5fbd0742d2780e8ffac4af40fa72d97-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-26.ec2.internal\" (UID: \"f5fbd0742d2780e8ffac4af40fa72d97\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-26.ec2.internal"
Apr 20 11:41:53.241571 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.241417 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1a9153efe987700267f82546f061485e-config\") pod \"kube-apiserver-proxy-ip-10-0-141-26.ec2.internal\" (UID: \"1a9153efe987700267f82546f061485e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-26.ec2.internal"
Apr 20 11:41:53.323035 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:53.322989 2585 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-26.ec2.internal\" not found"
Apr 20 11:41:53.342489 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.342463 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f5fbd0742d2780e8ffac4af40fa72d97-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-26.ec2.internal\" (UID: \"f5fbd0742d2780e8ffac4af40fa72d97\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-26.ec2.internal"
Apr 20 11:41:53.342556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.342493 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f5fbd0742d2780e8ffac4af40fa72d97-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-26.ec2.internal\" (UID: \"f5fbd0742d2780e8ffac4af40fa72d97\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-26.ec2.internal"
Apr 20 11:41:53.342556 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.342512 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1a9153efe987700267f82546f061485e-config\") pod \"kube-apiserver-proxy-ip-10-0-141-26.ec2.internal\" (UID: \"1a9153efe987700267f82546f061485e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-26.ec2.internal"
Apr 20 11:41:53.342624 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.342564 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1a9153efe987700267f82546f061485e-config\") pod \"kube-apiserver-proxy-ip-10-0-141-26.ec2.internal\" (UID: \"1a9153efe987700267f82546f061485e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-26.ec2.internal"
Apr 20 11:41:53.342624 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.342573 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f5fbd0742d2780e8ffac4af40fa72d97-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-26.ec2.internal\" (UID: \"f5fbd0742d2780e8ffac4af40fa72d97\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-26.ec2.internal"
Apr 20 11:41:53.342624 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.342582 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f5fbd0742d2780e8ffac4af40fa72d97-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-26.ec2.internal\" (UID: \"f5fbd0742d2780e8ffac4af40fa72d97\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-26.ec2.internal"
Apr 20 11:41:53.423956 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:53.423886 2585 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-26.ec2.internal\" not found"
Apr 20 11:41:53.483463 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.483433 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-26.ec2.internal"
Apr 20 11:41:53.488526 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.488502 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-26.ec2.internal"
Apr 20 11:41:53.524176 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:53.524133 2585 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-26.ec2.internal\" not found"
Apr 20 11:41:53.624671 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:53.624616 2585 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-26.ec2.internal\" not found"
Apr 20 11:41:53.725321 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:53.725236 2585 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-26.ec2.internal\" not found"
Apr 20 11:41:53.825836 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:53.825804 2585 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-26.ec2.internal\" not found"
Apr 20 11:41:53.850374 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.850347 2585 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 11:41:53.851022 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.850477 2585 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 11:41:53.925903 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:53.925875 2585 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-26.ec2.internal\" not found"
Apr 20 11:41:53.938230 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.938186 2585 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 11:41:53.954219 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.954198 2585 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 11:41:53.978652 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.978583 2585 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-stdl2"
Apr 20 11:41:53.983718 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.983663 2585 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 11:36:52 +0000 UTC" deadline="2027-12-23 23:13:14.837572938 +0000 UTC"
Apr 20 11:41:53.983863 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.983719 2585 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14699h31m20.85385846s"
Apr 20 11:41:53.988097 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:53.988080 2585 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-stdl2"
Apr 20 11:41:54.026857 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:54.026828 2585 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-141-26.ec2.internal\" not found" Apr 20 11:41:54.112768 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:54.112732 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5fbd0742d2780e8ffac4af40fa72d97.slice/crio-eaaca38fc83f712d31253b244db81deab9d8a4570c3e37b33264410260e3b8b1 WatchSource:0}: Error finding container eaaca38fc83f712d31253b244db81deab9d8a4570c3e37b33264410260e3b8b1: Status 404 returned error can't find the container with id eaaca38fc83f712d31253b244db81deab9d8a4570c3e37b33264410260e3b8b1 Apr 20 11:41:54.113100 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:54.113078 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a9153efe987700267f82546f061485e.slice/crio-e6f0ece400078519d4ddabb9dd7f174edd9aefa2615a79c3ab0d70bdd1365f3e WatchSource:0}: Error finding container e6f0ece400078519d4ddabb9dd7f174edd9aefa2615a79c3ab0d70bdd1365f3e: Status 404 returned error can't find the container with id e6f0ece400078519d4ddabb9dd7f174edd9aefa2615a79c3ab0d70bdd1365f3e Apr 20 11:41:54.116196 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.116178 2585 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 11:41:54.126949 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:54.126929 2585 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-26.ec2.internal\" not found" Apr 20 11:41:54.139831 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.139808 2585 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 11:41:54.210461 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.210440 2585 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 11:41:54.237992 ip-10-0-141-26 kubenswrapper[2585]: I0420 
11:41:54.237921 2585 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-26.ec2.internal" Apr 20 11:41:54.258004 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.257979 2585 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 11:41:54.259367 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.259354 2585 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-26.ec2.internal" Apr 20 11:41:54.271790 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.271771 2585 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 11:41:54.438090 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.438058 2585 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 11:41:54.919795 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.919754 2585 apiserver.go:52] "Watching apiserver" Apr 20 11:41:54.929769 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.929739 2585 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 11:41:54.930213 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.930194 2585 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/kube-apiserver-proxy-ip-10-0-141-26.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp","openshift-cluster-node-tuning-operator/tuned-hmhcw","openshift-image-registry/node-ca-kvmvq","openshift-multus/multus-4m7pn","openshift-multus/multus-additional-cni-plugins-2kbqc","openshift-multus/network-metrics-daemon-gl2dq","kube-system/global-pull-secret-syncer-48wpt","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-26.ec2.internal","openshift-network-diagnostics/network-check-target-mrrxd","openshift-network-operator/iptables-alerter-gnwr5","openshift-ovn-kubernetes/ovnkube-node-l6c62","kube-system/konnectivity-agent-92zb6"] Apr 20 11:41:54.933108 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.933081 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gl2dq" Apr 20 11:41:54.933225 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:54.933167 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gl2dq" podUID="3f86abc4-981a-497f-8da8-2b998417e124" Apr 20 11:41:54.935718 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.935245 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp" Apr 20 11:41:54.938134 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.938116 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 11:41:54.938309 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.938290 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 11:41:54.938525 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.938509 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-sgnmv\"" Apr 20 11:41:54.938705 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.938678 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 11:41:54.942281 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.941299 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:54.944460 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.944350 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-4m7pn" Apr 20 11:41:54.944971 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.944950 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-z5tr9\"" Apr 20 11:41:54.946242 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.946005 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 11:41:54.946242 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.946028 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 11:41:54.946892 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.946871 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 11:41:54.947096 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.947077 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2kbqc" Apr 20 11:41:54.947211 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.946978 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kvmvq" Apr 20 11:41:54.948275 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.948254 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 11:41:54.948374 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.948357 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 11:41:54.948374 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.948365 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xnlkj\"" Apr 20 11:41:54.948624 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.948603 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 11:41:54.949645 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.949622 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-multus-socket-dir-parent\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:54.949837 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.949816 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-multus-conf-dir\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:54.949935 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.949735 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mrrxd" Apr 20 11:41:54.949935 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.949870 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8e0954af-e279-48c2-8485-2e1a2c5da32f-registration-dir\") pod \"aws-ebs-csi-driver-node-wt4qp\" (UID: \"8e0954af-e279-48c2-8485-2e1a2c5da32f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp" Apr 20 11:41:54.949935 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.949896 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-etc-modprobe-d\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:54.949935 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.949901 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 11:41:54.949935 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:54.949914 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mrrxd" podUID="bceb7c3e-9a84-4f27-8b25-3497e4f2353e" Apr 20 11:41:54.949935 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.949919 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-etc-kubernetes\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:54.949935 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.949757 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hzhl5\"" Apr 20 11:41:54.950267 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.949958 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46wwh\" (UniqueName: \"kubernetes.io/projected/fd307f40-3318-4b65-b92c-eced354114fd-kube-api-access-46wwh\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:54.950267 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.949921 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 11:41:54.950267 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950001 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt5fg\" (UniqueName: \"kubernetes.io/projected/bb716ff4-9386-4b54-8b88-2680a1fb36a1-kube-api-access-qt5fg\") pod \"node-ca-kvmvq\" (UID: \"bb716ff4-9386-4b54-8b88-2680a1fb36a1\") " pod="openshift-image-registry/node-ca-kvmvq" Apr 20 11:41:54.950267 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950026 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-multus-daemon-config\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:54.950267 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950062 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8e0954af-e279-48c2-8485-2e1a2c5da32f-device-dir\") pod \"aws-ebs-csi-driver-node-wt4qp\" (UID: \"8e0954af-e279-48c2-8485-2e1a2c5da32f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp" Apr 20 11:41:54.950267 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950089 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8e0954af-e279-48c2-8485-2e1a2c5da32f-etc-selinux\") pod \"aws-ebs-csi-driver-node-wt4qp\" (UID: \"8e0954af-e279-48c2-8485-2e1a2c5da32f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp" Apr 20 11:41:54.950267 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950108 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-sys\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:54.950267 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950133 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-cnibin\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:54.950267 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950151 2585 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-os-release\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:54.950267 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950164 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-host-run-netns\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:54.950267 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950177 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-hostroot\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:54.950267 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950196 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-lib-modules\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:54.950267 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950219 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fd307f40-3318-4b65-b92c-eced354114fd-etc-tuned\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:54.950267 ip-10-0-141-26 kubenswrapper[2585]: I0420 
11:41:54.950231 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ttmbb\"" Apr 20 11:41:54.950267 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950237 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bb716ff4-9386-4b54-8b88-2680a1fb36a1-serviceca\") pod \"node-ca-kvmvq\" (UID: \"bb716ff4-9386-4b54-8b88-2680a1fb36a1\") " pod="openshift-image-registry/node-ca-kvmvq" Apr 20 11:41:54.950873 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950285 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs\") pod \"network-metrics-daemon-gl2dq\" (UID: \"3f86abc4-981a-497f-8da8-2b998417e124\") " pod="openshift-multus/network-metrics-daemon-gl2dq" Apr 20 11:41:54.950873 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950321 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-etc-sysctl-conf\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:54.950873 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950346 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-host-var-lib-cni-bin\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:54.950873 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950372 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-host-var-lib-kubelet\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:54.950873 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950400 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8e0954af-e279-48c2-8485-2e1a2c5da32f-socket-dir\") pod \"aws-ebs-csi-driver-node-wt4qp\" (UID: \"8e0954af-e279-48c2-8485-2e1a2c5da32f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp" Apr 20 11:41:54.950873 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950417 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8e0954af-e279-48c2-8485-2e1a2c5da32f-sys-fs\") pod \"aws-ebs-csi-driver-node-wt4qp\" (UID: \"8e0954af-e279-48c2-8485-2e1a2c5da32f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp" Apr 20 11:41:54.950873 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950438 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb716ff4-9386-4b54-8b88-2680a1fb36a1-host\") pod \"node-ca-kvmvq\" (UID: \"bb716ff4-9386-4b54-8b88-2680a1fb36a1\") " pod="openshift-image-registry/node-ca-kvmvq" Apr 20 11:41:54.950873 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950453 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-system-cni-dir\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:54.950873 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950474 2585 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-host-run-k8s-cni-cncf-io\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:54.950873 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950504 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-host-var-lib-cni-multus\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:54.950873 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950529 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-etc-sysctl-d\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:54.950873 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950551 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-etc-systemd\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:54.950873 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950577 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-var-lib-kubelet\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " 
pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:54.950873 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950597 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd307f40-3318-4b65-b92c-eced354114fd-tmp\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:54.950873 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950620 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghf2p\" (UniqueName: \"kubernetes.io/projected/8e0954af-e279-48c2-8485-2e1a2c5da32f-kube-api-access-ghf2p\") pod \"aws-ebs-csi-driver-node-wt4qp\" (UID: \"8e0954af-e279-48c2-8485-2e1a2c5da32f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp" Apr 20 11:41:54.950873 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950643 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-multus-cni-dir\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:54.950873 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950721 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-cni-binary-copy\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:54.951546 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950750 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-host-run-multus-certs\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn"
Apr 20 11:41:54.951546 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950773 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-etc-kubernetes\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn"
Apr 20 11:41:54.951546 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950793 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-host\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw"
Apr 20 11:41:54.951546 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950814 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-etc-sysconfig\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw"
Apr 20 11:41:54.951546 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950838 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjxvt\" (UniqueName: \"kubernetes.io/projected/3f86abc4-981a-497f-8da8-2b998417e124-kube-api-access-wjxvt\") pod \"network-metrics-daemon-gl2dq\" (UID: \"3f86abc4-981a-497f-8da8-2b998417e124\") " pod="openshift-multus/network-metrics-daemon-gl2dq"
Apr 20 11:41:54.951546 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950861 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e0954af-e279-48c2-8485-2e1a2c5da32f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wt4qp\" (UID: \"8e0954af-e279-48c2-8485-2e1a2c5da32f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp"
Apr 20 11:41:54.951546 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950886 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-run\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw"
Apr 20 11:41:54.951546 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.950911 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6c5q\" (UniqueName: \"kubernetes.io/projected/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-kube-api-access-n6c5q\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn"
Apr 20 11:41:54.951546 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.951088 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 11:41:54.951546 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.951133 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 11:41:54.951546 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.951319 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 11:41:54.954339 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.954321 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-gnwr5"
Apr 20 11:41:54.956724 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.956708 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:54.956929 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.956727 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-92zb6"
Apr 20 11:41:54.957027 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.956799 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 11:41:54.957440 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.957403 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 11:41:54.957526 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.957420 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 11:41:54.957754 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.957735 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gdngf\""
Apr 20 11:41:54.959124 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.959107 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-48wpt"
Apr 20 11:41:54.959213 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:54.959159 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-48wpt" podUID="07c8c3fc-2976-4ee9-904f-92f6a777b537"
Apr 20 11:41:54.961398 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.961377 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-s669b\""
Apr 20 11:41:54.961505 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.961489 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 11:41:54.961734 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.961672 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-gbzm5\""
Apr 20 11:41:54.961839 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.961822 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 11:41:54.961896 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.961772 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 11:41:54.961896 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.961875 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 11:41:54.961896 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.961884 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 11:41:54.962042 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.961811 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 11:41:54.962042 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.961712 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 11:41:54.962042 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.961832 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 11:41:54.988789 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.988758 2585 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 11:36:53 +0000 UTC" deadline="2028-01-17 16:58:11.964229398 +0000 UTC"
Apr 20 11:41:54.988789 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:54.988787 2585 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15293h16m16.975446734s"
Apr 20 11:41:55.040800 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.040775 2585 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 11:41:55.051886 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.051852 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bb716ff4-9386-4b54-8b88-2680a1fb36a1-serviceca\") pod \"node-ca-kvmvq\" (UID: \"bb716ff4-9386-4b54-8b88-2680a1fb36a1\") " pod="openshift-image-registry/node-ca-kvmvq"
Apr 20 11:41:55.052017 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.051899 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-etc-sysctl-conf\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw"
Apr 20 11:41:55.052017 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.051926 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-host-var-lib-cni-bin\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn"
Apr 20 11:41:55.052017 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.051949 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-host-var-lib-kubelet\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn"
Apr 20 11:41:55.052017 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.051974 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8e0954af-e279-48c2-8485-2e1a2c5da32f-socket-dir\") pod \"aws-ebs-csi-driver-node-wt4qp\" (UID: \"8e0954af-e279-48c2-8485-2e1a2c5da32f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp"
Apr 20 11:41:55.052206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052024 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb716ff4-9386-4b54-8b88-2680a1fb36a1-host\") pod \"node-ca-kvmvq\" (UID: \"bb716ff4-9386-4b54-8b88-2680a1fb36a1\") " pod="openshift-image-registry/node-ca-kvmvq"
Apr 20 11:41:55.052206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052050 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-host-run-k8s-cni-cncf-io\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn"
Apr 20 11:41:55.052206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052079 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7648da60-df45-45eb-92a1-f0b097849361-cni-binary-copy\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc"
Apr 20 11:41:55.052206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052108 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-host-cni-bin\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.052206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052134 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9a0e8a26-b424-497e-b30b-d497b9949b05-ovnkube-script-lib\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.052206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052172 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-var-lib-kubelet\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw"
Apr 20 11:41:55.052206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052196 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd307f40-3318-4b65-b92c-eced354114fd-tmp\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw"
Apr 20 11:41:55.052530 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052220 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-cni-binary-copy\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn"
Apr 20 11:41:55.052530 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052244 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-host-run-multus-certs\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn"
Apr 20 11:41:55.052530 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052268 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-etc-kubernetes\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn"
Apr 20 11:41:55.052530 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052303 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9a0e8a26-b424-497e-b30b-d497b9949b05-ovnkube-config\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.052530 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052329 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-etc-sysconfig\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw"
Apr 20 11:41:55.052530 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052367 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7648da60-df45-45eb-92a1-f0b097849361-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc"
Apr 20 11:41:55.052530 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052393 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd5w8\" (UniqueName: \"kubernetes.io/projected/2b81064b-5a70-4705-b8d7-bf578249b1ec-kube-api-access-kd5w8\") pod \"iptables-alerter-gnwr5\" (UID: \"2b81064b-5a70-4705-b8d7-bf578249b1ec\") " pod="openshift-network-operator/iptables-alerter-gnwr5"
Apr 20 11:41:55.052530 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052420 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-multus-socket-dir-parent\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn"
Apr 20 11:41:55.052530 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052443 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-multus-conf-dir\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn"
Apr 20 11:41:55.052530 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052466 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7648da60-df45-45eb-92a1-f0b097849361-system-cni-dir\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc"
Apr 20 11:41:55.052530 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052489 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7648da60-df45-45eb-92a1-f0b097849361-cnibin\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc"
Apr 20 11:41:55.052530 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052513 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7648da60-df45-45eb-92a1-f0b097849361-os-release\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc"
Apr 20 11:41:55.053067 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052553 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2b81064b-5a70-4705-b8d7-bf578249b1ec-host-slash\") pod \"iptables-alerter-gnwr5\" (UID: \"2b81064b-5a70-4705-b8d7-bf578249b1ec\") " pod="openshift-network-operator/iptables-alerter-gnwr5"
Apr 20 11:41:55.053067 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052580 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8e0954af-e279-48c2-8485-2e1a2c5da32f-registration-dir\") pod \"aws-ebs-csi-driver-node-wt4qp\" (UID: \"8e0954af-e279-48c2-8485-2e1a2c5da32f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp"
Apr 20 11:41:55.053067 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052609 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-etc-kubernetes\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw"
Apr 20 11:41:55.053067 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052636 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46wwh\" (UniqueName: \"kubernetes.io/projected/fd307f40-3318-4b65-b92c-eced354114fd-kube-api-access-46wwh\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw"
Apr 20 11:41:55.053067 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052661 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-multus-daemon-config\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn"
Apr 20 11:41:55.053067 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052706 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7648da60-df45-45eb-92a1-f0b097849361-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc"
Apr 20 11:41:55.053067 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052740 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-host-kubelet\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.053067 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052766 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8e0954af-e279-48c2-8485-2e1a2c5da32f-device-dir\") pod \"aws-ebs-csi-driver-node-wt4qp\" (UID: \"8e0954af-e279-48c2-8485-2e1a2c5da32f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp"
Apr 20 11:41:55.053067 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052795 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8e0954af-e279-48c2-8485-2e1a2c5da32f-etc-selinux\") pod \"aws-ebs-csi-driver-node-wt4qp\" (UID: \"8e0954af-e279-48c2-8485-2e1a2c5da32f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp"
Apr 20 11:41:55.053067 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052819 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-sys\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw"
Apr 20 11:41:55.053067 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052861 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-cnibin\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn"
Apr 20 11:41:55.053067 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052887 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-host-run-netns\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn"
Apr 20 11:41:55.053067 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052897 2585 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 20 11:41:55.053067 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052928 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt6zz\" (UniqueName: \"kubernetes.io/projected/7648da60-df45-45eb-92a1-f0b097849361-kube-api-access-nt6zz\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc"
Apr 20 11:41:55.053067 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052969 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-var-lib-openvswitch\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.053067 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.052993 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9a0e8a26-b424-497e-b30b-d497b9949b05-ovn-node-metrics-cert\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.053067 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053014 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-lib-modules\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw"
Apr 20 11:41:55.053876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053036 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fd307f40-3318-4b65-b92c-eced354114fd-etc-tuned\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw"
Apr 20 11:41:55.053876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053079 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs\") pod \"network-metrics-daemon-gl2dq\" (UID: \"3f86abc4-981a-497f-8da8-2b998417e124\") " pod="openshift-multus/network-metrics-daemon-gl2dq"
Apr 20 11:41:55.053876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053107 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-host-run-netns\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.053876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053133 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-run-systemd\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.053876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053157 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-run-openvswitch\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.053876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053173 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-node-log\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.053876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053188 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.053876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053210 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8e0954af-e279-48c2-8485-2e1a2c5da32f-sys-fs\") pod \"aws-ebs-csi-driver-node-wt4qp\" (UID: \"8e0954af-e279-48c2-8485-2e1a2c5da32f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp"
Apr 20 11:41:55.053876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053225 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-system-cni-dir\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn"
Apr 20 11:41:55.053876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053241 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-host-var-lib-cni-multus\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn"
Apr 20 11:41:55.053876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053259 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2b81064b-5a70-4705-b8d7-bf578249b1ec-iptables-alerter-script\") pod \"iptables-alerter-gnwr5\" (UID: \"2b81064b-5a70-4705-b8d7-bf578249b1ec\") " pod="openshift-network-operator/iptables-alerter-gnwr5"
Apr 20 11:41:55.053876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053274 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-host-slash\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.053876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053299 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-host-run-ovn-kubernetes\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.053876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053314 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9a0e8a26-b424-497e-b30b-d497b9949b05-env-overrides\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.053876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053336 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5e22d374-f2bf-4a0f-8b5f-a2396ea20c95-agent-certs\") pod \"konnectivity-agent-92zb6\" (UID: \"5e22d374-f2bf-4a0f-8b5f-a2396ea20c95\") " pod="kube-system/konnectivity-agent-92zb6"
Apr 20 11:41:55.053876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053359 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-etc-sysctl-d\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw"
Apr 20 11:41:55.053876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053375 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-etc-systemd\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw"
Apr 20 11:41:55.054764 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053393 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7648da60-df45-45eb-92a1-f0b097849361-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc"
Apr 20 11:41:55.054764 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053409 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-host-cni-netd\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.054764 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053424 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck2m9\" (UniqueName: \"kubernetes.io/projected/9a0e8a26-b424-497e-b30b-d497b9949b05-kube-api-access-ck2m9\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.054764 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053443 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghf2p\" (UniqueName: \"kubernetes.io/projected/8e0954af-e279-48c2-8485-2e1a2c5da32f-kube-api-access-ghf2p\") pod \"aws-ebs-csi-driver-node-wt4qp\" (UID: \"8e0954af-e279-48c2-8485-2e1a2c5da32f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp"
Apr 20 11:41:55.054764 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053463 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-multus-cni-dir\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn"
Apr 20 11:41:55.054764 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053478 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-log-socket\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.054764 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053496 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/07c8c3fc-2976-4ee9-904f-92f6a777b537-kubelet-config\") pod \"global-pull-secret-syncer-48wpt\" (UID: \"07c8c3fc-2976-4ee9-904f-92f6a777b537\") " pod="kube-system/global-pull-secret-syncer-48wpt"
Apr 20 11:41:55.054764 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053512 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/07c8c3fc-2976-4ee9-904f-92f6a777b537-original-pull-secret\") pod \"global-pull-secret-syncer-48wpt\" (UID: \"07c8c3fc-2976-4ee9-904f-92f6a777b537\") " pod="kube-system/global-pull-secret-syncer-48wpt"
Apr 20 11:41:55.054764 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053528 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-host\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw"
Apr 20 11:41:55.054764 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053588 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wjxvt\" (UniqueName: \"kubernetes.io/projected/3f86abc4-981a-497f-8da8-2b998417e124-kube-api-access-wjxvt\") pod \"network-metrics-daemon-gl2dq\" (UID: \"3f86abc4-981a-497f-8da8-2b998417e124\") " pod="openshift-multus/network-metrics-daemon-gl2dq"
Apr 20 11:41:55.054764 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053615 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e0954af-e279-48c2-8485-2e1a2c5da32f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wt4qp\" (UID: \"8e0954af-e279-48c2-8485-2e1a2c5da32f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp"
Apr 20 11:41:55.054764 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053647 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-run\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw"
Apr 20 11:41:55.054764 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053679 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6c5q\" (UniqueName: \"kubernetes.io/projected/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-kube-api-access-n6c5q\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn"
Apr 20 11:41:55.054764 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053724 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-systemd-units\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.054764 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053731 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-host-run-multus-certs\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn"
Apr 20 11:41:55.054764 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053750 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpwgq\" (UniqueName: \"kubernetes.io/projected/bceb7c3e-9a84-4f27-8b25-3497e4f2353e-kube-api-access-qpwgq\") pod \"network-check-target-mrrxd\" (UID: \"bceb7c3e-9a84-4f27-8b25-3497e4f2353e\") " pod="openshift-network-diagnostics/network-check-target-mrrxd"
Apr 20 11:41:55.054764 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053774 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-run-ovn\") pod
\"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" Apr 20 11:41:55.055411 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053781 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-etc-sysctl-conf\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:55.055411 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053803 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-host-run-k8s-cni-cncf-io\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:55.055411 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053814 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-etc-modprobe-d\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:55.055411 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.053994 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-multus-cni-dir\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:55.055411 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054063 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-host\") pod \"tuned-hmhcw\" (UID: 
\"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:55.055411 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054192 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qt5fg\" (UniqueName: \"kubernetes.io/projected/bb716ff4-9386-4b54-8b88-2680a1fb36a1-kube-api-access-qt5fg\") pod \"node-ca-kvmvq\" (UID: \"bb716ff4-9386-4b54-8b88-2680a1fb36a1\") " pod="openshift-image-registry/node-ca-kvmvq" Apr 20 11:41:55.055411 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054231 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-etc-openvswitch\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" Apr 20 11:41:55.055411 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054263 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-os-release\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:55.055411 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054271 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e0954af-e279-48c2-8485-2e1a2c5da32f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wt4qp\" (UID: \"8e0954af-e279-48c2-8485-2e1a2c5da32f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp" Apr 20 11:41:55.055411 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054276 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bb716ff4-9386-4b54-8b88-2680a1fb36a1-serviceca\") pod \"node-ca-kvmvq\" 
(UID: \"bb716ff4-9386-4b54-8b88-2680a1fb36a1\") " pod="openshift-image-registry/node-ca-kvmvq" Apr 20 11:41:55.055411 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054290 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-hostroot\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:55.055411 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054332 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5e22d374-f2bf-4a0f-8b5f-a2396ea20c95-konnectivity-ca\") pod \"konnectivity-agent-92zb6\" (UID: \"5e22d374-f2bf-4a0f-8b5f-a2396ea20c95\") " pod="kube-system/konnectivity-agent-92zb6" Apr 20 11:41:55.055411 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054338 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-host-var-lib-cni-bin\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:55.055411 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054371 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-host-var-lib-kubelet\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:55.055411 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054383 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-cni-binary-copy\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " 
pod="openshift-multus/multus-4m7pn" Apr 20 11:41:55.055411 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054407 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8e0954af-e279-48c2-8485-2e1a2c5da32f-sys-fs\") pod \"aws-ebs-csi-driver-node-wt4qp\" (UID: \"8e0954af-e279-48c2-8485-2e1a2c5da32f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp" Apr 20 11:41:55.055411 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054414 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-lib-modules\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:55.055411 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:55.054452 2585 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:41:55.056206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054457 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-run\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:55.056206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054463 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8e0954af-e279-48c2-8485-2e1a2c5da32f-socket-dir\") pod \"aws-ebs-csi-driver-node-wt4qp\" (UID: \"8e0954af-e279-48c2-8485-2e1a2c5da32f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp" Apr 20 11:41:55.056206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054468 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-os-release\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:55.056206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054509 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-etc-kubernetes\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:55.056206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054506 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8e0954af-e279-48c2-8485-2e1a2c5da32f-registration-dir\") pod \"aws-ebs-csi-driver-node-wt4qp\" (UID: \"8e0954af-e279-48c2-8485-2e1a2c5da32f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp" Apr 20 11:41:55.056206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054522 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-etc-sysconfig\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:55.056206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054562 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-hostroot\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:55.056206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054572 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-var-lib-kubelet\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:55.056206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054604 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/07c8c3fc-2976-4ee9-904f-92f6a777b537-dbus\") pod \"global-pull-secret-syncer-48wpt\" (UID: \"07c8c3fc-2976-4ee9-904f-92f6a777b537\") " pod="kube-system/global-pull-secret-syncer-48wpt" Apr 20 11:41:55.056206 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:55.054628 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs podName:3f86abc4-981a-497f-8da8-2b998417e124 nodeName:}" failed. No retries permitted until 2026-04-20 11:41:55.55460133 +0000 UTC m=+3.027702415 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs") pod "network-metrics-daemon-gl2dq" (UID: "3f86abc4-981a-497f-8da8-2b998417e124") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:41:55.056206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054371 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-multus-conf-dir\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:55.056206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054651 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-etc-modprobe-d\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:55.056206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054470 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-system-cni-dir\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:55.056206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054684 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb716ff4-9386-4b54-8b88-2680a1fb36a1-host\") pod \"node-ca-kvmvq\" (UID: \"bb716ff4-9386-4b54-8b88-2680a1fb36a1\") " pod="openshift-image-registry/node-ca-kvmvq" Apr 20 11:41:55.056206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054725 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/8e0954af-e279-48c2-8485-2e1a2c5da32f-device-dir\") pod \"aws-ebs-csi-driver-node-wt4qp\" (UID: \"8e0954af-e279-48c2-8485-2e1a2c5da32f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp" Apr 20 11:41:55.056206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054798 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-host-var-lib-cni-multus\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:55.056206 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054805 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-etc-kubernetes\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:55.056795 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054818 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-host-run-netns\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:55.056795 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054878 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-cnibin\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:55.056795 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054879 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/8e0954af-e279-48c2-8485-2e1a2c5da32f-etc-selinux\") pod \"aws-ebs-csi-driver-node-wt4qp\" (UID: \"8e0954af-e279-48c2-8485-2e1a2c5da32f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp" Apr 20 11:41:55.056795 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054903 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-multus-socket-dir-parent\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:55.056795 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.054928 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-etc-sysctl-d\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:55.056795 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.055042 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-sys\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:55.056795 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.055083 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fd307f40-3318-4b65-b92c-eced354114fd-etc-systemd\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:55.056795 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.055560 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-multus-daemon-config\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:55.056795 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.055885 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd307f40-3318-4b65-b92c-eced354114fd-tmp\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:55.056795 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.056375 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-26.ec2.internal" event={"ID":"1a9153efe987700267f82546f061485e","Type":"ContainerStarted","Data":"e6f0ece400078519d4ddabb9dd7f174edd9aefa2615a79c3ab0d70bdd1365f3e"} Apr 20 11:41:55.057410 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.057387 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-26.ec2.internal" event={"ID":"f5fbd0742d2780e8ffac4af40fa72d97","Type":"ContainerStarted","Data":"eaaca38fc83f712d31253b244db81deab9d8a4570c3e37b33264410260e3b8b1"} Apr 20 11:41:55.057410 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.057392 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fd307f40-3318-4b65-b92c-eced354114fd-etc-tuned\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:55.070431 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.070402 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjxvt\" (UniqueName: \"kubernetes.io/projected/3f86abc4-981a-497f-8da8-2b998417e124-kube-api-access-wjxvt\") pod \"network-metrics-daemon-gl2dq\" (UID: 
\"3f86abc4-981a-497f-8da8-2b998417e124\") " pod="openshift-multus/network-metrics-daemon-gl2dq" Apr 20 11:41:55.070672 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.070571 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6c5q\" (UniqueName: \"kubernetes.io/projected/88e132ed-bc16-4c9e-a2a8-1f11c7217cd0-kube-api-access-n6c5q\") pod \"multus-4m7pn\" (UID: \"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0\") " pod="openshift-multus/multus-4m7pn" Apr 20 11:41:55.071128 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.071109 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt5fg\" (UniqueName: \"kubernetes.io/projected/bb716ff4-9386-4b54-8b88-2680a1fb36a1-kube-api-access-qt5fg\") pod \"node-ca-kvmvq\" (UID: \"bb716ff4-9386-4b54-8b88-2680a1fb36a1\") " pod="openshift-image-registry/node-ca-kvmvq" Apr 20 11:41:55.072238 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.072216 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46wwh\" (UniqueName: \"kubernetes.io/projected/fd307f40-3318-4b65-b92c-eced354114fd-kube-api-access-46wwh\") pod \"tuned-hmhcw\" (UID: \"fd307f40-3318-4b65-b92c-eced354114fd\") " pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" Apr 20 11:41:55.074557 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.074528 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghf2p\" (UniqueName: \"kubernetes.io/projected/8e0954af-e279-48c2-8485-2e1a2c5da32f-kube-api-access-ghf2p\") pod \"aws-ebs-csi-driver-node-wt4qp\" (UID: \"8e0954af-e279-48c2-8485-2e1a2c5da32f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp" Apr 20 11:41:55.155391 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155185 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7648da60-df45-45eb-92a1-f0b097849361-os-release\") pod 
\"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc" Apr 20 11:41:55.155579 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155409 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2b81064b-5a70-4705-b8d7-bf578249b1ec-host-slash\") pod \"iptables-alerter-gnwr5\" (UID: \"2b81064b-5a70-4705-b8d7-bf578249b1ec\") " pod="openshift-network-operator/iptables-alerter-gnwr5" Apr 20 11:41:55.155579 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155438 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7648da60-df45-45eb-92a1-f0b097849361-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc" Apr 20 11:41:55.155579 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155459 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-host-kubelet\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" Apr 20 11:41:55.155579 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155311 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7648da60-df45-45eb-92a1-f0b097849361-os-release\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc" Apr 20 11:41:55.155579 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155484 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nt6zz\" 
(UniqueName: \"kubernetes.io/projected/7648da60-df45-45eb-92a1-f0b097849361-kube-api-access-nt6zz\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc" Apr 20 11:41:55.155579 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155524 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-host-kubelet\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" Apr 20 11:41:55.155579 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155544 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2b81064b-5a70-4705-b8d7-bf578249b1ec-host-slash\") pod \"iptables-alerter-gnwr5\" (UID: \"2b81064b-5a70-4705-b8d7-bf578249b1ec\") " pod="openshift-network-operator/iptables-alerter-gnwr5" Apr 20 11:41:55.155948 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155585 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-var-lib-openvswitch\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" Apr 20 11:41:55.155948 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155614 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9a0e8a26-b424-497e-b30b-d497b9949b05-ovn-node-metrics-cert\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" Apr 20 11:41:55.155948 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155644 2585 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-var-lib-openvswitch\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" Apr 20 11:41:55.155948 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155679 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-host-run-netns\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" Apr 20 11:41:55.155948 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155647 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-host-run-netns\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" Apr 20 11:41:55.155948 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155751 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-run-systemd\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" Apr 20 11:41:55.155948 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155794 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-run-openvswitch\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" Apr 20 11:41:55.155948 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155811 2585 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-node-log\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.155948 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155813 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-run-systemd\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.155948 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155827 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.155948 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155850 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2b81064b-5a70-4705-b8d7-bf578249b1ec-iptables-alerter-script\") pod \"iptables-alerter-gnwr5\" (UID: \"2b81064b-5a70-4705-b8d7-bf578249b1ec\") " pod="openshift-network-operator/iptables-alerter-gnwr5"
Apr 20 11:41:55.155948 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155864 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-host-slash\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.155948 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155862 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-run-openvswitch\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.155948 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155886 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-host-run-ovn-kubernetes\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.155948 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155902 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-node-log\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.155948 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155918 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9a0e8a26-b424-497e-b30b-d497b9949b05-env-overrides\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.155948 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155931 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.156670 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155941 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5e22d374-f2bf-4a0f-8b5f-a2396ea20c95-agent-certs\") pod \"konnectivity-agent-92zb6\" (UID: \"5e22d374-f2bf-4a0f-8b5f-a2396ea20c95\") " pod="kube-system/konnectivity-agent-92zb6"
Apr 20 11:41:55.156670 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155944 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-host-run-ovn-kubernetes\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.156670 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.155942 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-host-slash\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.156670 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.156063 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7648da60-df45-45eb-92a1-f0b097849361-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc"
Apr 20 11:41:55.156670 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.156068 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7648da60-df45-45eb-92a1-f0b097849361-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc"
Apr 20 11:41:55.156670 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.156095 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-host-cni-netd\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.156670 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.156122 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ck2m9\" (UniqueName: \"kubernetes.io/projected/9a0e8a26-b424-497e-b30b-d497b9949b05-kube-api-access-ck2m9\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.156670 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.156149 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-log-socket\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.156670 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.156150 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-host-cni-netd\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.156670 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.156173 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/07c8c3fc-2976-4ee9-904f-92f6a777b537-kubelet-config\") pod \"global-pull-secret-syncer-48wpt\" (UID: \"07c8c3fc-2976-4ee9-904f-92f6a777b537\") " pod="kube-system/global-pull-secret-syncer-48wpt"
Apr 20 11:41:55.156670 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.156197 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7648da60-df45-45eb-92a1-f0b097849361-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc"
Apr 20 11:41:55.156670 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.156217 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-log-socket\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.156670 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.156232 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/07c8c3fc-2976-4ee9-904f-92f6a777b537-original-pull-secret\") pod \"global-pull-secret-syncer-48wpt\" (UID: \"07c8c3fc-2976-4ee9-904f-92f6a777b537\") " pod="kube-system/global-pull-secret-syncer-48wpt"
Apr 20 11:41:55.156670 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.156296 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-systemd-units\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.156670 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.156323 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpwgq\" (UniqueName: \"kubernetes.io/projected/bceb7c3e-9a84-4f27-8b25-3497e4f2353e-kube-api-access-qpwgq\") pod \"network-check-target-mrrxd\" (UID: \"bceb7c3e-9a84-4f27-8b25-3497e4f2353e\") " pod="openshift-network-diagnostics/network-check-target-mrrxd"
Apr 20 11:41:55.156670 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.156331 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/07c8c3fc-2976-4ee9-904f-92f6a777b537-kubelet-config\") pod \"global-pull-secret-syncer-48wpt\" (UID: \"07c8c3fc-2976-4ee9-904f-92f6a777b537\") " pod="kube-system/global-pull-secret-syncer-48wpt"
Apr 20 11:41:55.156670 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.156348 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-run-ovn\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.157404 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.156370 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-systemd-units\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.157404 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.156374 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-etc-openvswitch\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.157404 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.156403 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9a0e8a26-b424-497e-b30b-d497b9949b05-env-overrides\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.157404 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.156405 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2b81064b-5a70-4705-b8d7-bf578249b1ec-iptables-alerter-script\") pod \"iptables-alerter-gnwr5\" (UID: \"2b81064b-5a70-4705-b8d7-bf578249b1ec\") " pod="openshift-network-operator/iptables-alerter-gnwr5"
Apr 20 11:41:55.157404 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.156410 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5e22d374-f2bf-4a0f-8b5f-a2396ea20c95-konnectivity-ca\") pod \"konnectivity-agent-92zb6\" (UID: \"5e22d374-f2bf-4a0f-8b5f-a2396ea20c95\") " pod="kube-system/konnectivity-agent-92zb6"
Apr 20 11:41:55.157404 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:55.156447 2585 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 11:41:55.157404 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.156451 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-run-ovn\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.157404 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.156465 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/07c8c3fc-2976-4ee9-904f-92f6a777b537-dbus\") pod \"global-pull-secret-syncer-48wpt\" (UID: \"07c8c3fc-2976-4ee9-904f-92f6a777b537\") " pod="kube-system/global-pull-secret-syncer-48wpt"
Apr 20 11:41:55.157404 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:55.156990 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07c8c3fc-2976-4ee9-904f-92f6a777b537-original-pull-secret podName:07c8c3fc-2976-4ee9-904f-92f6a777b537 nodeName:}" failed. No retries permitted until 2026-04-20 11:41:55.656963635 +0000 UTC m=+3.130064713 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/07c8c3fc-2976-4ee9-904f-92f6a777b537-original-pull-secret") pod "global-pull-secret-syncer-48wpt" (UID: "07c8c3fc-2976-4ee9-904f-92f6a777b537") : object "kube-system"/"original-pull-secret" not registered
Apr 20 11:41:55.157404 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.157130 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7648da60-df45-45eb-92a1-f0b097849361-cni-binary-copy\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc"
Apr 20 11:41:55.157404 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.157224 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-host-cni-bin\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.157404 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.157265 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9a0e8a26-b424-497e-b30b-d497b9949b05-ovnkube-script-lib\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.157404 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.157301 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9a0e8a26-b424-497e-b30b-d497b9949b05-ovnkube-config\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.157404 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.157341 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7648da60-df45-45eb-92a1-f0b097849361-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc"
Apr 20 11:41:55.157404 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.157378 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kd5w8\" (UniqueName: \"kubernetes.io/projected/2b81064b-5a70-4705-b8d7-bf578249b1ec-kube-api-access-kd5w8\") pod \"iptables-alerter-gnwr5\" (UID: \"2b81064b-5a70-4705-b8d7-bf578249b1ec\") " pod="openshift-network-operator/iptables-alerter-gnwr5"
Apr 20 11:41:55.158097 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.157414 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7648da60-df45-45eb-92a1-f0b097849361-system-cni-dir\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc"
Apr 20 11:41:55.158097 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.157446 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7648da60-df45-45eb-92a1-f0b097849361-cnibin\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc"
Apr 20 11:41:55.158097 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.157535 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7648da60-df45-45eb-92a1-f0b097849361-cnibin\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc"
Apr 20 11:41:55.158097 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.157759 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-host-cni-bin\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.158353 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.158333 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9a0e8a26-b424-497e-b30b-d497b9949b05-ovnkube-script-lib\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.158775 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.158754 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9a0e8a26-b424-497e-b30b-d497b9949b05-ovnkube-config\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.159153 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.159132 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7648da60-df45-45eb-92a1-f0b097849361-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc"
Apr 20 11:41:55.159444 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.159425 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7648da60-df45-45eb-92a1-f0b097849361-system-cni-dir\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc"
Apr 20 11:41:55.159522 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.159454 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a0e8a26-b424-497e-b30b-d497b9949b05-etc-openvswitch\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.160236 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.159789 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5e22d374-f2bf-4a0f-8b5f-a2396ea20c95-konnectivity-ca\") pod \"konnectivity-agent-92zb6\" (UID: \"5e22d374-f2bf-4a0f-8b5f-a2396ea20c95\") " pod="kube-system/konnectivity-agent-92zb6"
Apr 20 11:41:55.160236 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.159943 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/07c8c3fc-2976-4ee9-904f-92f6a777b537-dbus\") pod \"global-pull-secret-syncer-48wpt\" (UID: \"07c8c3fc-2976-4ee9-904f-92f6a777b537\") " pod="kube-system/global-pull-secret-syncer-48wpt"
Apr 20 11:41:55.160236 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.160038 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9a0e8a26-b424-497e-b30b-d497b9949b05-ovn-node-metrics-cert\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.160236 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.160207 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5e22d374-f2bf-4a0f-8b5f-a2396ea20c95-agent-certs\") pod \"konnectivity-agent-92zb6\" (UID: \"5e22d374-f2bf-4a0f-8b5f-a2396ea20c95\") " pod="kube-system/konnectivity-agent-92zb6"
Apr 20 11:41:55.160236 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.160209 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7648da60-df45-45eb-92a1-f0b097849361-cni-binary-copy\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc"
Apr 20 11:41:55.164185 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:55.164166 2585 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 11:41:55.164185 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:55.164185 2585 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 11:41:55.164338 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:55.164195 2585 projected.go:194] Error preparing data for projected volume kube-api-access-qpwgq for pod openshift-network-diagnostics/network-check-target-mrrxd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 11:41:55.164338 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:55.164249 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bceb7c3e-9a84-4f27-8b25-3497e4f2353e-kube-api-access-qpwgq podName:bceb7c3e-9a84-4f27-8b25-3497e4f2353e nodeName:}" failed. No retries permitted until 2026-04-20 11:41:55.66423382 +0000 UTC m=+3.137334890 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-qpwgq" (UniqueName: "kubernetes.io/projected/bceb7c3e-9a84-4f27-8b25-3497e4f2353e-kube-api-access-qpwgq") pod "network-check-target-mrrxd" (UID: "bceb7c3e-9a84-4f27-8b25-3497e4f2353e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 11:41:55.165905 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.165857 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt6zz\" (UniqueName: \"kubernetes.io/projected/7648da60-df45-45eb-92a1-f0b097849361-kube-api-access-nt6zz\") pod \"multus-additional-cni-plugins-2kbqc\" (UID: \"7648da60-df45-45eb-92a1-f0b097849361\") " pod="openshift-multus/multus-additional-cni-plugins-2kbqc"
Apr 20 11:41:55.165978 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.165886 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck2m9\" (UniqueName: \"kubernetes.io/projected/9a0e8a26-b424-497e-b30b-d497b9949b05-kube-api-access-ck2m9\") pod \"ovnkube-node-l6c62\" (UID: \"9a0e8a26-b424-497e-b30b-d497b9949b05\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.167303 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.167283 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd5w8\" (UniqueName: \"kubernetes.io/projected/2b81064b-5a70-4705-b8d7-bf578249b1ec-kube-api-access-kd5w8\") pod \"iptables-alerter-gnwr5\" (UID: \"2b81064b-5a70-4705-b8d7-bf578249b1ec\") " pod="openshift-network-operator/iptables-alerter-gnwr5"
Apr 20 11:41:55.198310 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.198235 2585 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 11:41:55.248072 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.248043 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp"
Apr 20 11:41:55.256776 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.256754 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hmhcw"
Apr 20 11:41:55.270531 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.270506 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4m7pn"
Apr 20 11:41:55.278144 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.278122 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2kbqc"
Apr 20 11:41:55.285756 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.285734 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kvmvq"
Apr 20 11:41:55.293327 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.293304 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-gnwr5"
Apr 20 11:41:55.302991 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.302971 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:41:55.308639 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.308618 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-92zb6"
Apr 20 11:41:55.561160 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.561037 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs\") pod \"network-metrics-daemon-gl2dq\" (UID: \"3f86abc4-981a-497f-8da8-2b998417e124\") " pod="openshift-multus/network-metrics-daemon-gl2dq"
Apr 20 11:41:55.561314 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:55.561211 2585 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 11:41:55.561314 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:55.561280 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs podName:3f86abc4-981a-497f-8da8-2b998417e124 nodeName:}" failed. No retries permitted until 2026-04-20 11:41:56.561260886 +0000 UTC m=+4.034361956 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs") pod "network-metrics-daemon-gl2dq" (UID: "3f86abc4-981a-497f-8da8-2b998417e124") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 11:41:55.662193 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.662163 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/07c8c3fc-2976-4ee9-904f-92f6a777b537-original-pull-secret\") pod \"global-pull-secret-syncer-48wpt\" (UID: \"07c8c3fc-2976-4ee9-904f-92f6a777b537\") " pod="kube-system/global-pull-secret-syncer-48wpt"
Apr 20 11:41:55.662337 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:55.662274 2585 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 11:41:55.662337 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:55.662321 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07c8c3fc-2976-4ee9-904f-92f6a777b537-original-pull-secret podName:07c8c3fc-2976-4ee9-904f-92f6a777b537 nodeName:}" failed. No retries permitted until 2026-04-20 11:41:56.662308478 +0000 UTC m=+4.135409542 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/07c8c3fc-2976-4ee9-904f-92f6a777b537-original-pull-secret") pod "global-pull-secret-syncer-48wpt" (UID: "07c8c3fc-2976-4ee9-904f-92f6a777b537") : object "kube-system"/"original-pull-secret" not registered
Apr 20 11:41:55.762951 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.762908 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpwgq\" (UniqueName: \"kubernetes.io/projected/bceb7c3e-9a84-4f27-8b25-3497e4f2353e-kube-api-access-qpwgq\") pod \"network-check-target-mrrxd\" (UID: \"bceb7c3e-9a84-4f27-8b25-3497e4f2353e\") " pod="openshift-network-diagnostics/network-check-target-mrrxd"
Apr 20 11:41:55.763109 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:55.763071 2585 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 11:41:55.763109 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:55.763102 2585 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 11:41:55.763189 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:55.763116 2585 projected.go:194] Error preparing data for projected volume kube-api-access-qpwgq for pod openshift-network-diagnostics/network-check-target-mrrxd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 11:41:55.763237 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:55.763192 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bceb7c3e-9a84-4f27-8b25-3497e4f2353e-kube-api-access-qpwgq podName:bceb7c3e-9a84-4f27-8b25-3497e4f2353e nodeName:}" failed. No retries permitted until 2026-04-20 11:41:56.763160903 +0000 UTC m=+4.236261981 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-qpwgq" (UniqueName: "kubernetes.io/projected/bceb7c3e-9a84-4f27-8b25-3497e4f2353e-kube-api-access-qpwgq") pod "network-check-target-mrrxd" (UID: "bceb7c3e-9a84-4f27-8b25-3497e4f2353e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 11:41:55.891224 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:55.891176 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e22d374_f2bf_4a0f_8b5f_a2396ea20c95.slice/crio-5e9a6f2c4e66291309477e3f21d4e390d2e024e4b642896035487eebdfcdd555 WatchSource:0}: Error finding container 5e9a6f2c4e66291309477e3f21d4e390d2e024e4b642896035487eebdfcdd555: Status 404 returned error can't find the container with id 5e9a6f2c4e66291309477e3f21d4e390d2e024e4b642896035487eebdfcdd555
Apr 20 11:41:55.895111 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:55.895026 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b81064b_5a70_4705_b8d7_bf578249b1ec.slice/crio-f2f37b11af0f3a132dba42861485859b5d6b774ac9dfcff54283a46b9f627a12 WatchSource:0}: Error finding container f2f37b11af0f3a132dba42861485859b5d6b774ac9dfcff54283a46b9f627a12: Status 404 returned error can't find the container with id f2f37b11af0f3a132dba42861485859b5d6b774ac9dfcff54283a46b9f627a12
Apr 20 11:41:55.896369 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:55.896340 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e132ed_bc16_4c9e_a2a8_1f11c7217cd0.slice/crio-c7f397fdee9dd4e05508bcc145ebb2daf1531ad0135a9f29dcb00c9b58abe992 WatchSource:0}: Error finding container c7f397fdee9dd4e05508bcc145ebb2daf1531ad0135a9f29dcb00c9b58abe992: Status 404 returned error can't find the container with id c7f397fdee9dd4e05508bcc145ebb2daf1531ad0135a9f29dcb00c9b58abe992
Apr 20 11:41:55.916932 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:55.916904 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7648da60_df45_45eb_92a1_f0b097849361.slice/crio-9814c4d630be9b58ef77261fea2ba1ea97617dc7b8ccf842bb563e7603397e9c WatchSource:0}: Error finding container 9814c4d630be9b58ef77261fea2ba1ea97617dc7b8ccf842bb563e7603397e9c: Status 404 returned error can't find the container with id 9814c4d630be9b58ef77261fea2ba1ea97617dc7b8ccf842bb563e7603397e9c
Apr 20 11:41:55.917566 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:55.917548 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a0e8a26_b424_497e_b30b_d497b9949b05.slice/crio-38f077947e8d66149241e26d1ba2c0e21e08d69290f4b1fdfaa1ead763a3538f WatchSource:0}: Error finding container 38f077947e8d66149241e26d1ba2c0e21e08d69290f4b1fdfaa1ead763a3538f: Status 404 returned error can't find the container with id 38f077947e8d66149241e26d1ba2c0e21e08d69290f4b1fdfaa1ead763a3538f
Apr 20 11:41:55.918484 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:55.918440 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd307f40_3318_4b65_b92c_eced354114fd.slice/crio-59fedf2e6300efa220c52b3ad965c3ac3d9fe6f1c520d58320c0762f1d6a5374 WatchSource:0}: Error finding container 59fedf2e6300efa220c52b3ad965c3ac3d9fe6f1c520d58320c0762f1d6a5374: Status 404 returned error can't find the container with id 59fedf2e6300efa220c52b3ad965c3ac3d9fe6f1c520d58320c0762f1d6a5374
Apr 20 11:41:55.919581 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:41:55.919499 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e0954af_e279_48c2_8485_2e1a2c5da32f.slice/crio-92109c997929c127a53ff7dfd6d288195f63feb566311cefb5710c0a1f950253 WatchSource:0}: Error finding container 92109c997929c127a53ff7dfd6d288195f63feb566311cefb5710c0a1f950253: Status 404 returned error can't find the container with id 92109c997929c127a53ff7dfd6d288195f63feb566311cefb5710c0a1f950253
Apr 20 11:41:55.989097 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.989066 2585 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 11:36:53 +0000 UTC" deadline="2028-01-21 07:35:17.701845586 +0000 UTC"
Apr 20 11:41:55.989097 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:55.989094 2585 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15379h53m21.712754824s"
Apr 20 11:41:56.059726 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:56.059670 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-26.ec2.internal" event={"ID":"1a9153efe987700267f82546f061485e","Type":"ContainerStarted","Data":"6c00bddedbd3f1bf6c2d4ee1f48d93fcbf1d75209b2ce71e10227fc07089636c"}
Apr 20 11:41:56.060744 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:56.060716 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" event={"ID":"fd307f40-3318-4b65-b92c-eced354114fd","Type":"ContainerStarted","Data":"59fedf2e6300efa220c52b3ad965c3ac3d9fe6f1c520d58320c0762f1d6a5374"}
Apr 20 11:41:56.061640 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:56.061620 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp" event={"ID":"8e0954af-e279-48c2-8485-2e1a2c5da32f","Type":"ContainerStarted","Data":"92109c997929c127a53ff7dfd6d288195f63feb566311cefb5710c0a1f950253"}
Apr 20 11:41:56.062576 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:56.062558 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2kbqc" event={"ID":"7648da60-df45-45eb-92a1-f0b097849361","Type":"ContainerStarted","Data":"9814c4d630be9b58ef77261fea2ba1ea97617dc7b8ccf842bb563e7603397e9c"}
Apr 20 11:41:56.063579 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:56.063557 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4m7pn" event={"ID":"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0","Type":"ContainerStarted","Data":"c7f397fdee9dd4e05508bcc145ebb2daf1531ad0135a9f29dcb00c9b58abe992"}
Apr 20 11:41:56.064575 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:56.064555 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-92zb6" event={"ID":"5e22d374-f2bf-4a0f-8b5f-a2396ea20c95","Type":"ContainerStarted","Data":"5e9a6f2c4e66291309477e3f21d4e390d2e024e4b642896035487eebdfcdd555"}
Apr 20 11:41:56.065468 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:56.065450 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" event={"ID":"9a0e8a26-b424-497e-b30b-d497b9949b05","Type":"ContainerStarted","Data":"38f077947e8d66149241e26d1ba2c0e21e08d69290f4b1fdfaa1ead763a3538f"}
Apr 20 11:41:56.066294 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:56.066276 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gnwr5" event={"ID":"2b81064b-5a70-4705-b8d7-bf578249b1ec","Type":"ContainerStarted","Data":"f2f37b11af0f3a132dba42861485859b5d6b774ac9dfcff54283a46b9f627a12"}
Apr 20 11:41:56.067275 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:56.067254 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kvmvq"
event={"ID":"bb716ff4-9386-4b54-8b88-2680a1fb36a1","Type":"ContainerStarted","Data":"bea62b369c42fe63a933d31bf5ad0d1d05351f48ec3388e7b256778b35a46cc8"} Apr 20 11:41:56.570057 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:56.569343 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs\") pod \"network-metrics-daemon-gl2dq\" (UID: \"3f86abc4-981a-497f-8da8-2b998417e124\") " pod="openshift-multus/network-metrics-daemon-gl2dq" Apr 20 11:41:56.570057 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:56.569621 2585 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:41:56.570057 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:56.569710 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs podName:3f86abc4-981a-497f-8da8-2b998417e124 nodeName:}" failed. No retries permitted until 2026-04-20 11:41:58.569668303 +0000 UTC m=+6.042769374 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs") pod "network-metrics-daemon-gl2dq" (UID: "3f86abc4-981a-497f-8da8-2b998417e124") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:41:56.670816 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:56.670166 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/07c8c3fc-2976-4ee9-904f-92f6a777b537-original-pull-secret\") pod \"global-pull-secret-syncer-48wpt\" (UID: \"07c8c3fc-2976-4ee9-904f-92f6a777b537\") " pod="kube-system/global-pull-secret-syncer-48wpt" Apr 20 11:41:56.670816 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:56.670334 2585 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 11:41:56.670816 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:56.670394 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07c8c3fc-2976-4ee9-904f-92f6a777b537-original-pull-secret podName:07c8c3fc-2976-4ee9-904f-92f6a777b537 nodeName:}" failed. No retries permitted until 2026-04-20 11:41:58.67037604 +0000 UTC m=+6.143477109 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/07c8c3fc-2976-4ee9-904f-92f6a777b537-original-pull-secret") pod "global-pull-secret-syncer-48wpt" (UID: "07c8c3fc-2976-4ee9-904f-92f6a777b537") : object "kube-system"/"original-pull-secret" not registered Apr 20 11:41:56.771424 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:56.771357 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpwgq\" (UniqueName: \"kubernetes.io/projected/bceb7c3e-9a84-4f27-8b25-3497e4f2353e-kube-api-access-qpwgq\") pod \"network-check-target-mrrxd\" (UID: \"bceb7c3e-9a84-4f27-8b25-3497e4f2353e\") " pod="openshift-network-diagnostics/network-check-target-mrrxd" Apr 20 11:41:56.771609 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:56.771541 2585 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 11:41:56.771609 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:56.771561 2585 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 11:41:56.771609 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:56.771574 2585 projected.go:194] Error preparing data for projected volume kube-api-access-qpwgq for pod openshift-network-diagnostics/network-check-target-mrrxd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:41:56.771791 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:56.771632 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bceb7c3e-9a84-4f27-8b25-3497e4f2353e-kube-api-access-qpwgq podName:bceb7c3e-9a84-4f27-8b25-3497e4f2353e nodeName:}" failed. 
No retries permitted until 2026-04-20 11:41:58.771614638 +0000 UTC m=+6.244715719 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-qpwgq" (UniqueName: "kubernetes.io/projected/bceb7c3e-9a84-4f27-8b25-3497e4f2353e-kube-api-access-qpwgq") pod "network-check-target-mrrxd" (UID: "bceb7c3e-9a84-4f27-8b25-3497e4f2353e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:41:57.063024 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:57.062860 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mrrxd" Apr 20 11:41:57.063469 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:57.063025 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mrrxd" podUID="bceb7c3e-9a84-4f27-8b25-3497e4f2353e" Apr 20 11:41:57.063469 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:57.062859 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-48wpt" Apr 20 11:41:57.063588 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:57.063555 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-48wpt" podUID="07c8c3fc-2976-4ee9-904f-92f6a777b537" Apr 20 11:41:57.065959 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:57.064062 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gl2dq" Apr 20 11:41:57.065959 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:57.064206 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gl2dq" podUID="3f86abc4-981a-497f-8da8-2b998417e124" Apr 20 11:41:57.092677 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:57.092632 2585 generic.go:358] "Generic (PLEG): container finished" podID="f5fbd0742d2780e8ffac4af40fa72d97" containerID="a16880806f539396fa6dccd00f4f07cabff4d7a993061b18130fcaffd2a7014a" exitCode=0 Apr 20 11:41:57.093518 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:57.093492 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-26.ec2.internal" event={"ID":"f5fbd0742d2780e8ffac4af40fa72d97","Type":"ContainerDied","Data":"a16880806f539396fa6dccd00f4f07cabff4d7a993061b18130fcaffd2a7014a"} Apr 20 11:41:57.117555 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:57.117500 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-26.ec2.internal" podStartSLOduration=3.117482901 podStartE2EDuration="3.117482901s" podCreationTimestamp="2026-04-20 11:41:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 11:41:56.074548407 +0000 UTC m=+3.547649494" watchObservedRunningTime="2026-04-20 
11:41:57.117482901 +0000 UTC m=+4.590583987" Apr 20 11:41:58.101547 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:58.101005 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-26.ec2.internal" event={"ID":"f5fbd0742d2780e8ffac4af40fa72d97","Type":"ContainerStarted","Data":"c696cbdfd338eafcd1a64f517148a3429729d4c5e8f20456f8522e21de9847f1"} Apr 20 11:41:58.116026 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:58.115981 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-26.ec2.internal" podStartSLOduration=4.115967809 podStartE2EDuration="4.115967809s" podCreationTimestamp="2026-04-20 11:41:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 11:41:58.115414905 +0000 UTC m=+5.588515994" watchObservedRunningTime="2026-04-20 11:41:58.115967809 +0000 UTC m=+5.589068894" Apr 20 11:41:58.589319 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:58.589239 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs\") pod \"network-metrics-daemon-gl2dq\" (UID: \"3f86abc4-981a-497f-8da8-2b998417e124\") " pod="openshift-multus/network-metrics-daemon-gl2dq" Apr 20 11:41:58.589564 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:58.589405 2585 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:41:58.589564 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:58.589477 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs podName:3f86abc4-981a-497f-8da8-2b998417e124 nodeName:}" failed. 
No retries permitted until 2026-04-20 11:42:02.589455041 +0000 UTC m=+10.062556119 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs") pod "network-metrics-daemon-gl2dq" (UID: "3f86abc4-981a-497f-8da8-2b998417e124") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:41:58.690127 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:58.690087 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/07c8c3fc-2976-4ee9-904f-92f6a777b537-original-pull-secret\") pod \"global-pull-secret-syncer-48wpt\" (UID: \"07c8c3fc-2976-4ee9-904f-92f6a777b537\") " pod="kube-system/global-pull-secret-syncer-48wpt" Apr 20 11:41:58.690298 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:58.690283 2585 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 11:41:58.690361 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:58.690348 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07c8c3fc-2976-4ee9-904f-92f6a777b537-original-pull-secret podName:07c8c3fc-2976-4ee9-904f-92f6a777b537 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:02.690328703 +0000 UTC m=+10.163429768 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/07c8c3fc-2976-4ee9-904f-92f6a777b537-original-pull-secret") pod "global-pull-secret-syncer-48wpt" (UID: "07c8c3fc-2976-4ee9-904f-92f6a777b537") : object "kube-system"/"original-pull-secret" not registered Apr 20 11:41:58.790851 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:58.790817 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpwgq\" (UniqueName: \"kubernetes.io/projected/bceb7c3e-9a84-4f27-8b25-3497e4f2353e-kube-api-access-qpwgq\") pod \"network-check-target-mrrxd\" (UID: \"bceb7c3e-9a84-4f27-8b25-3497e4f2353e\") " pod="openshift-network-diagnostics/network-check-target-mrrxd" Apr 20 11:41:58.791036 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:58.790996 2585 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 11:41:58.791036 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:58.791013 2585 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 11:41:58.791036 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:58.791022 2585 projected.go:194] Error preparing data for projected volume kube-api-access-qpwgq for pod openshift-network-diagnostics/network-check-target-mrrxd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:41:58.791200 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:58.791076 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bceb7c3e-9a84-4f27-8b25-3497e4f2353e-kube-api-access-qpwgq podName:bceb7c3e-9a84-4f27-8b25-3497e4f2353e nodeName:}" failed. 
No retries permitted until 2026-04-20 11:42:02.79105742 +0000 UTC m=+10.264158488 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-qpwgq" (UniqueName: "kubernetes.io/projected/bceb7c3e-9a84-4f27-8b25-3497e4f2353e-kube-api-access-qpwgq") pod "network-check-target-mrrxd" (UID: "bceb7c3e-9a84-4f27-8b25-3497e4f2353e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:41:59.050275 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:59.050161 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mrrxd" Apr 20 11:41:59.050448 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:59.050296 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mrrxd" podUID="bceb7c3e-9a84-4f27-8b25-3497e4f2353e" Apr 20 11:41:59.050748 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:59.050728 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-48wpt" Apr 20 11:41:59.050852 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:59.050830 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-48wpt" podUID="07c8c3fc-2976-4ee9-904f-92f6a777b537" Apr 20 11:41:59.050919 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:41:59.050908 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gl2dq" Apr 20 11:41:59.051023 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:41:59.050992 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gl2dq" podUID="3f86abc4-981a-497f-8da8-2b998417e124" Apr 20 11:42:01.051060 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:01.051024 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-48wpt" Apr 20 11:42:01.051618 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:01.051159 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-48wpt" podUID="07c8c3fc-2976-4ee9-904f-92f6a777b537" Apr 20 11:42:01.051618 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:01.051542 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gl2dq" Apr 20 11:42:01.051757 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:01.051646 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gl2dq" podUID="3f86abc4-981a-497f-8da8-2b998417e124" Apr 20 11:42:01.051757 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:01.051714 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mrrxd" Apr 20 11:42:01.051871 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:01.051787 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mrrxd" podUID="bceb7c3e-9a84-4f27-8b25-3497e4f2353e" Apr 20 11:42:02.622469 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:02.622427 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs\") pod \"network-metrics-daemon-gl2dq\" (UID: \"3f86abc4-981a-497f-8da8-2b998417e124\") " pod="openshift-multus/network-metrics-daemon-gl2dq" Apr 20 11:42:02.622887 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:02.622568 2585 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:42:02.622887 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:02.622623 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs podName:3f86abc4-981a-497f-8da8-2b998417e124 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:10.622609617 +0000 UTC m=+18.095710680 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs") pod "network-metrics-daemon-gl2dq" (UID: "3f86abc4-981a-497f-8da8-2b998417e124") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:42:02.723295 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:02.723204 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/07c8c3fc-2976-4ee9-904f-92f6a777b537-original-pull-secret\") pod \"global-pull-secret-syncer-48wpt\" (UID: \"07c8c3fc-2976-4ee9-904f-92f6a777b537\") " pod="kube-system/global-pull-secret-syncer-48wpt" Apr 20 11:42:02.723483 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:02.723375 2585 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 11:42:02.723483 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:02.723457 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07c8c3fc-2976-4ee9-904f-92f6a777b537-original-pull-secret podName:07c8c3fc-2976-4ee9-904f-92f6a777b537 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:10.723436573 +0000 UTC m=+18.196537646 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/07c8c3fc-2976-4ee9-904f-92f6a777b537-original-pull-secret") pod "global-pull-secret-syncer-48wpt" (UID: "07c8c3fc-2976-4ee9-904f-92f6a777b537") : object "kube-system"/"original-pull-secret" not registered Apr 20 11:42:02.823861 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:02.823821 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpwgq\" (UniqueName: \"kubernetes.io/projected/bceb7c3e-9a84-4f27-8b25-3497e4f2353e-kube-api-access-qpwgq\") pod \"network-check-target-mrrxd\" (UID: \"bceb7c3e-9a84-4f27-8b25-3497e4f2353e\") " pod="openshift-network-diagnostics/network-check-target-mrrxd" Apr 20 11:42:02.824065 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:02.823993 2585 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 11:42:02.824065 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:02.824011 2585 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 11:42:02.824065 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:02.824024 2585 projected.go:194] Error preparing data for projected volume kube-api-access-qpwgq for pod openshift-network-diagnostics/network-check-target-mrrxd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:02.824218 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:02.824081 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bceb7c3e-9a84-4f27-8b25-3497e4f2353e-kube-api-access-qpwgq podName:bceb7c3e-9a84-4f27-8b25-3497e4f2353e nodeName:}" failed. 
No retries permitted until 2026-04-20 11:42:10.82406285 +0000 UTC m=+18.297163919 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-qpwgq" (UniqueName: "kubernetes.io/projected/bceb7c3e-9a84-4f27-8b25-3497e4f2353e-kube-api-access-qpwgq") pod "network-check-target-mrrxd" (UID: "bceb7c3e-9a84-4f27-8b25-3497e4f2353e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:03.051981 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:03.051903 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gl2dq" Apr 20 11:42:03.052137 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:03.052018 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gl2dq" podUID="3f86abc4-981a-497f-8da8-2b998417e124" Apr 20 11:42:03.052388 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:03.052371 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mrrxd" Apr 20 11:42:03.052465 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:03.052433 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mrrxd" podUID="bceb7c3e-9a84-4f27-8b25-3497e4f2353e" Apr 20 11:42:03.052589 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:03.052576 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-48wpt" Apr 20 11:42:03.052654 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:03.052635 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-48wpt" podUID="07c8c3fc-2976-4ee9-904f-92f6a777b537" Apr 20 11:42:05.050597 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:05.050549 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mrrxd" Apr 20 11:42:05.051071 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:05.050677 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mrrxd" podUID="bceb7c3e-9a84-4f27-8b25-3497e4f2353e" Apr 20 11:42:05.051071 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:05.050723 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gl2dq"
Apr 20 11:42:05.051071 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:05.050893 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gl2dq" podUID="3f86abc4-981a-497f-8da8-2b998417e124"
Apr 20 11:42:05.051071 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:05.050925 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-48wpt"
Apr 20 11:42:05.051071 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:05.051028 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-48wpt" podUID="07c8c3fc-2976-4ee9-904f-92f6a777b537"
Apr 20 11:42:07.053638 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:07.053601 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gl2dq"
Apr 20 11:42:07.054108 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:07.053601 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mrrxd"
Apr 20 11:42:07.054108 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:07.053760 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gl2dq" podUID="3f86abc4-981a-497f-8da8-2b998417e124"
Apr 20 11:42:07.054108 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:07.053601 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-48wpt"
Apr 20 11:42:07.054108 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:07.053804 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mrrxd" podUID="bceb7c3e-9a84-4f27-8b25-3497e4f2353e"
Apr 20 11:42:07.054108 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:07.053891 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-48wpt" podUID="07c8c3fc-2976-4ee9-904f-92f6a777b537"
Apr 20 11:42:09.053560 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:09.053527 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gl2dq"
Apr 20 11:42:09.054038 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:09.053662 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mrrxd"
Apr 20 11:42:09.054038 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:09.053711 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-48wpt"
Apr 20 11:42:09.054038 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:09.053658 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gl2dq" podUID="3f86abc4-981a-497f-8da8-2b998417e124"
Apr 20 11:42:09.054038 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:09.053792 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mrrxd" podUID="bceb7c3e-9a84-4f27-8b25-3497e4f2353e"
Apr 20 11:42:09.054038 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:09.053886 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-48wpt" podUID="07c8c3fc-2976-4ee9-904f-92f6a777b537"
Apr 20 11:42:10.681166 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:10.681122 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs\") pod \"network-metrics-daemon-gl2dq\" (UID: \"3f86abc4-981a-497f-8da8-2b998417e124\") " pod="openshift-multus/network-metrics-daemon-gl2dq"
Apr 20 11:42:10.681739 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:10.681305 2585 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 11:42:10.681739 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:10.681433 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs podName:3f86abc4-981a-497f-8da8-2b998417e124 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:26.681409871 +0000 UTC m=+34.154510955 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs") pod "network-metrics-daemon-gl2dq" (UID: "3f86abc4-981a-497f-8da8-2b998417e124") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 11:42:10.782147 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:10.782114 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/07c8c3fc-2976-4ee9-904f-92f6a777b537-original-pull-secret\") pod \"global-pull-secret-syncer-48wpt\" (UID: \"07c8c3fc-2976-4ee9-904f-92f6a777b537\") " pod="kube-system/global-pull-secret-syncer-48wpt"
Apr 20 11:42:10.782330 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:10.782263 2585 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 11:42:10.782395 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:10.782336 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07c8c3fc-2976-4ee9-904f-92f6a777b537-original-pull-secret podName:07c8c3fc-2976-4ee9-904f-92f6a777b537 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:26.782314147 +0000 UTC m=+34.255415218 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/07c8c3fc-2976-4ee9-904f-92f6a777b537-original-pull-secret") pod "global-pull-secret-syncer-48wpt" (UID: "07c8c3fc-2976-4ee9-904f-92f6a777b537") : object "kube-system"/"original-pull-secret" not registered
Apr 20 11:42:10.883158 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:10.883115 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpwgq\" (UniqueName: \"kubernetes.io/projected/bceb7c3e-9a84-4f27-8b25-3497e4f2353e-kube-api-access-qpwgq\") pod \"network-check-target-mrrxd\" (UID: \"bceb7c3e-9a84-4f27-8b25-3497e4f2353e\") " pod="openshift-network-diagnostics/network-check-target-mrrxd"
Apr 20 11:42:10.883339 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:10.883238 2585 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 11:42:10.883339 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:10.883256 2585 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 11:42:10.883339 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:10.883266 2585 projected.go:194] Error preparing data for projected volume kube-api-access-qpwgq for pod openshift-network-diagnostics/network-check-target-mrrxd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 11:42:10.883339 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:10.883319 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bceb7c3e-9a84-4f27-8b25-3497e4f2353e-kube-api-access-qpwgq podName:bceb7c3e-9a84-4f27-8b25-3497e4f2353e nodeName:}" failed. No retries permitted until 2026-04-20 11:42:26.883303005 +0000 UTC m=+34.356404086 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-qpwgq" (UniqueName: "kubernetes.io/projected/bceb7c3e-9a84-4f27-8b25-3497e4f2353e-kube-api-access-qpwgq") pod "network-check-target-mrrxd" (UID: "bceb7c3e-9a84-4f27-8b25-3497e4f2353e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 11:42:11.053009 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:11.052969 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mrrxd"
Apr 20 11:42:11.053161 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:11.052973 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-48wpt"
Apr 20 11:42:11.053161 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:11.053070 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mrrxd" podUID="bceb7c3e-9a84-4f27-8b25-3497e4f2353e"
Apr 20 11:42:11.053271 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:11.053169 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-48wpt" podUID="07c8c3fc-2976-4ee9-904f-92f6a777b537"
Apr 20 11:42:11.053271 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:11.052976 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gl2dq"
Apr 20 11:42:11.053357 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:11.053268 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gl2dq" podUID="3f86abc4-981a-497f-8da8-2b998417e124"
Apr 20 11:42:13.051171 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:13.050979 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-48wpt"
Apr 20 11:42:13.051928 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:13.051293 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-48wpt" podUID="07c8c3fc-2976-4ee9-904f-92f6a777b537"
Apr 20 11:42:13.051928 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:13.051400 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gl2dq"
Apr 20 11:42:13.052539 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:13.052493 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gl2dq" podUID="3f86abc4-981a-497f-8da8-2b998417e124"
Apr 20 11:42:13.052638 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:13.052566 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mrrxd"
Apr 20 11:42:13.052855 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:13.052835 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mrrxd" podUID="bceb7c3e-9a84-4f27-8b25-3497e4f2353e"
Apr 20 11:42:13.126828 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:13.126798 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" event={"ID":"fd307f40-3318-4b65-b92c-eced354114fd","Type":"ContainerStarted","Data":"1dcbf01956f13ef780998f3960c260cf7fc1f82d741c423d95aea3358f7d0b87"}
Apr 20 11:42:13.128122 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:13.128093 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp" event={"ID":"8e0954af-e279-48c2-8485-2e1a2c5da32f","Type":"ContainerStarted","Data":"67f2ba078e0fc04a04e84d54dfe828162f4403b140b507cd574fe741b4808dce"}
Apr 20 11:42:13.129175 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:13.129153 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2kbqc" event={"ID":"7648da60-df45-45eb-92a1-f0b097849361","Type":"ContainerStarted","Data":"f2e757c0077e74d3ed413b3e836a7a7d6178566261c0aabe9d016a3a38e9d69d"}
Apr 20 11:42:13.130387 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:13.130366 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4m7pn" event={"ID":"88e132ed-bc16-4c9e-a2a8-1f11c7217cd0","Type":"ContainerStarted","Data":"9c21fb8915d3dcd76b3b5428016094e7c51307f1d2ff93d1428818c4c6e0a7e7"}
Apr 20 11:42:13.131619 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:13.131599 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-92zb6" event={"ID":"5e22d374-f2bf-4a0f-8b5f-a2396ea20c95","Type":"ContainerStarted","Data":"164689f60ec5ed0839e19ceb4bacab146118f3b691e64ffd9c8db2fa1e8de6a3"}
Apr 20 11:42:13.132851 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:13.132834 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6c62_9a0e8a26-b424-497e-b30b-d497b9949b05/ovn-acl-logging/0.log"
Apr 20 11:42:13.133136 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:13.133119 2585 generic.go:358] "Generic (PLEG): container finished" podID="9a0e8a26-b424-497e-b30b-d497b9949b05" containerID="ffb6ea7afa5d7d680ac5264db11558c4a695ff82e2547c5a50f9f343c886cb2e" exitCode=1
Apr 20 11:42:13.133179 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:13.133172 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" event={"ID":"9a0e8a26-b424-497e-b30b-d497b9949b05","Type":"ContainerDied","Data":"ffb6ea7afa5d7d680ac5264db11558c4a695ff82e2547c5a50f9f343c886cb2e"}
Apr 20 11:42:13.133215 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:13.133187 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" event={"ID":"9a0e8a26-b424-497e-b30b-d497b9949b05","Type":"ContainerStarted","Data":"c6bcb4a7dbecbbeaa7845d90296e2e53ade16b3059ae3a0daa0ba9f753606cba"}
Apr 20 11:42:13.134342 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:13.134324 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kvmvq" event={"ID":"bb716ff4-9386-4b54-8b88-2680a1fb36a1","Type":"ContainerStarted","Data":"3cc45a20b6bae6a2d2f0e66a2b2a18b7df74c560a8ed04b323de39cbff93b3e6"}
Apr 20 11:42:13.162343 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:13.162276 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-hmhcw" podStartSLOduration=3.3693488560000002 podStartE2EDuration="20.162256866s" podCreationTimestamp="2026-04-20 11:41:53 +0000 UTC" firstStartedPulling="2026-04-20 11:41:55.922452319 +0000 UTC m=+3.395553389" lastFinishedPulling="2026-04-20 11:42:12.715360322 +0000 UTC m=+20.188461399" observedRunningTime="2026-04-20 11:42:13.141814038 +0000 UTC m=+20.614915123" watchObservedRunningTime="2026-04-20 11:42:13.162256866 +0000 UTC m=+20.635357984"
Apr 20 11:42:13.181853 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:13.181794 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-92zb6" podStartSLOduration=3.43431071 podStartE2EDuration="20.181775553s" podCreationTimestamp="2026-04-20 11:41:53 +0000 UTC" firstStartedPulling="2026-04-20 11:41:55.895745056 +0000 UTC m=+3.368846127" lastFinishedPulling="2026-04-20 11:42:12.643209903 +0000 UTC m=+20.116310970" observedRunningTime="2026-04-20 11:42:13.180864631 +0000 UTC m=+20.653965717" watchObservedRunningTime="2026-04-20 11:42:13.181775553 +0000 UTC m=+20.654876641"
Apr 20 11:42:13.205538 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:13.205481 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4m7pn" podStartSLOduration=3.400269516 podStartE2EDuration="20.205461378s" podCreationTimestamp="2026-04-20 11:41:53 +0000 UTC" firstStartedPulling="2026-04-20 11:41:55.91580706 +0000 UTC m=+3.388908124" lastFinishedPulling="2026-04-20 11:42:12.720998918 +0000 UTC m=+20.194099986" observedRunningTime="2026-04-20 11:42:13.205137839 +0000 UTC m=+20.678238926" watchObservedRunningTime="2026-04-20 11:42:13.205461378 +0000 UTC m=+20.678562465"
Apr 20 11:42:14.137624 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.137593 2585 generic.go:358] "Generic (PLEG): container finished" podID="7648da60-df45-45eb-92a1-f0b097849361" containerID="f2e757c0077e74d3ed413b3e836a7a7d6178566261c0aabe9d016a3a38e9d69d" exitCode=0
Apr 20 11:42:14.138086 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.137679 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2kbqc" event={"ID":"7648da60-df45-45eb-92a1-f0b097849361","Type":"ContainerDied","Data":"f2e757c0077e74d3ed413b3e836a7a7d6178566261c0aabe9d016a3a38e9d69d"}
Apr 20 11:42:14.140044 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.140026 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6c62_9a0e8a26-b424-497e-b30b-d497b9949b05/ovn-acl-logging/0.log"
Apr 20 11:42:14.140326 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.140306 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" event={"ID":"9a0e8a26-b424-497e-b30b-d497b9949b05","Type":"ContainerStarted","Data":"fcfd462510b8aba5a8cc9fc7c4f6c59c1d99912b8750f5dffd596e3300aaaafc"}
Apr 20 11:42:14.140406 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.140335 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" event={"ID":"9a0e8a26-b424-497e-b30b-d497b9949b05","Type":"ContainerStarted","Data":"2c6c23e1d0fbf0d59b4afb3dc198090879315f4bcd84da79ca1fed7558ec41bd"}
Apr 20 11:42:14.140406 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.140348 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" event={"ID":"9a0e8a26-b424-497e-b30b-d497b9949b05","Type":"ContainerStarted","Data":"c93d7a775f32eb903047d576817f1a036a1221595844ccd8ce76b5eb922351cc"}
Apr 20 11:42:14.140406 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.140360 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" event={"ID":"9a0e8a26-b424-497e-b30b-d497b9949b05","Type":"ContainerStarted","Data":"a7bad2cf61ac3a467785309947b113cbb50fc4f295ced360bf37c26e19a8e744"}
Apr 20 11:42:14.141521 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.141499 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gnwr5" event={"ID":"2b81064b-5a70-4705-b8d7-bf578249b1ec","Type":"ContainerStarted","Data":"4e5464ef64e8fc98d1b3c28cf769c1c469950a18a50fc8a0112e4e3a4792bd1b"}
Apr 20 11:42:14.171417 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.171375 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kvmvq" podStartSLOduration=8.778385857 podStartE2EDuration="21.171362367s" podCreationTimestamp="2026-04-20 11:41:53 +0000 UTC" firstStartedPulling="2026-04-20 11:41:55.89568306 +0000 UTC m=+3.368784124" lastFinishedPulling="2026-04-20 11:42:08.28865957 +0000 UTC m=+15.761760634" observedRunningTime="2026-04-20 11:42:13.217054243 +0000 UTC m=+20.690155325" watchObservedRunningTime="2026-04-20 11:42:14.171362367 +0000 UTC m=+21.644463431"
Apr 20 11:42:14.415115 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.414961 2585 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 20 11:42:14.576194 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.576138 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-gnwr5" podStartSLOduration=4.810859233 podStartE2EDuration="21.576121135s" podCreationTimestamp="2026-04-20 11:41:53 +0000 UTC" firstStartedPulling="2026-04-20 11:41:55.915864844 +0000 UTC m=+3.388965912" lastFinishedPulling="2026-04-20 11:42:12.681126737 +0000 UTC m=+20.154227814" observedRunningTime="2026-04-20 11:42:14.186400644 +0000 UTC m=+21.659501740" watchObservedRunningTime="2026-04-20 11:42:14.576121135 +0000 UTC m=+22.049222220"
Apr 20 11:42:14.576866 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.576847 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-plzm6"]
Apr 20 11:42:14.591782 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.591750 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-plzm6"
Apr 20 11:42:14.594542 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.594522 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 20 11:42:14.594659 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.594624 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 20 11:42:14.594659 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.594624 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-v6bf2\""
Apr 20 11:42:14.715623 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.715587 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/80706795-3e55-4fb6-9b83-da08f0522340-hosts-file\") pod \"node-resolver-plzm6\" (UID: \"80706795-3e55-4fb6-9b83-da08f0522340\") " pod="openshift-dns/node-resolver-plzm6"
Apr 20 11:42:14.715805 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.715645 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2w4l\" (UniqueName: \"kubernetes.io/projected/80706795-3e55-4fb6-9b83-da08f0522340-kube-api-access-h2w4l\") pod \"node-resolver-plzm6\" (UID: \"80706795-3e55-4fb6-9b83-da08f0522340\") " pod="openshift-dns/node-resolver-plzm6"
Apr 20 11:42:14.715805 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.715721 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/80706795-3e55-4fb6-9b83-da08f0522340-tmp-dir\") pod \"node-resolver-plzm6\" (UID: \"80706795-3e55-4fb6-9b83-da08f0522340\") " pod="openshift-dns/node-resolver-plzm6"
Apr 20 11:42:14.816946 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.816907 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/80706795-3e55-4fb6-9b83-da08f0522340-hosts-file\") pod \"node-resolver-plzm6\" (UID: \"80706795-3e55-4fb6-9b83-da08f0522340\") " pod="openshift-dns/node-resolver-plzm6"
Apr 20 11:42:14.817123 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.816973 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2w4l\" (UniqueName: \"kubernetes.io/projected/80706795-3e55-4fb6-9b83-da08f0522340-kube-api-access-h2w4l\") pod \"node-resolver-plzm6\" (UID: \"80706795-3e55-4fb6-9b83-da08f0522340\") " pod="openshift-dns/node-resolver-plzm6"
Apr 20 11:42:14.817123 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.817012 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/80706795-3e55-4fb6-9b83-da08f0522340-tmp-dir\") pod \"node-resolver-plzm6\" (UID: \"80706795-3e55-4fb6-9b83-da08f0522340\") " pod="openshift-dns/node-resolver-plzm6"
Apr 20 11:42:14.817123 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.817052 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/80706795-3e55-4fb6-9b83-da08f0522340-hosts-file\") pod \"node-resolver-plzm6\" (UID: \"80706795-3e55-4fb6-9b83-da08f0522340\") " pod="openshift-dns/node-resolver-plzm6"
Apr 20 11:42:14.817641 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.817612 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/80706795-3e55-4fb6-9b83-da08f0522340-tmp-dir\") pod \"node-resolver-plzm6\" (UID: \"80706795-3e55-4fb6-9b83-da08f0522340\") " pod="openshift-dns/node-resolver-plzm6"
Apr 20 11:42:14.827975 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.827920 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2w4l\" (UniqueName: \"kubernetes.io/projected/80706795-3e55-4fb6-9b83-da08f0522340-kube-api-access-h2w4l\") pod \"node-resolver-plzm6\" (UID: \"80706795-3e55-4fb6-9b83-da08f0522340\") " pod="openshift-dns/node-resolver-plzm6"
Apr 20 11:42:14.901056 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:14.901023 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-plzm6"
Apr 20 11:42:14.910225 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:42:14.910191 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80706795_3e55_4fb6_9b83_da08f0522340.slice/crio-9c0be0cddf57e0fdcce00f8c6024b54e0b07db905f5c34760e6cdd7eb93ef9f9 WatchSource:0}: Error finding container 9c0be0cddf57e0fdcce00f8c6024b54e0b07db905f5c34760e6cdd7eb93ef9f9: Status 404 returned error can't find the container with id 9c0be0cddf57e0fdcce00f8c6024b54e0b07db905f5c34760e6cdd7eb93ef9f9
Apr 20 11:42:15.008467 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:15.008330 2585 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T11:42:14.415109415Z","UUID":"5148f4d4-2582-46c2-80f8-c40eb5b62d92","Handler":null,"Name":"","Endpoint":""}
Apr 20 11:42:15.010348 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:15.010307 2585 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 20 11:42:15.010348 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:15.010338 2585 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 20 11:42:15.058512 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:15.058486 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mrrxd"
Apr 20 11:42:15.058512 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:15.058506 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gl2dq"
Apr 20 11:42:15.059674 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:15.058600 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mrrxd" podUID="bceb7c3e-9a84-4f27-8b25-3497e4f2353e"
Apr 20 11:42:15.059674 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:15.058620 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-48wpt"
Apr 20 11:42:15.059674 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:15.058721 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-48wpt" podUID="07c8c3fc-2976-4ee9-904f-92f6a777b537"
Apr 20 11:42:15.059674 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:15.058799 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gl2dq" podUID="3f86abc4-981a-497f-8da8-2b998417e124"
Apr 20 11:42:15.145261 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:15.145218 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-plzm6" event={"ID":"80706795-3e55-4fb6-9b83-da08f0522340","Type":"ContainerStarted","Data":"9ba9d2bccc808e39c6757534f30c8bcaac104d5ad65edd15693ed87566678ce6"}
Apr 20 11:42:15.145261 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:15.145261 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-plzm6" event={"ID":"80706795-3e55-4fb6-9b83-da08f0522340","Type":"ContainerStarted","Data":"9c0be0cddf57e0fdcce00f8c6024b54e0b07db905f5c34760e6cdd7eb93ef9f9"}
Apr 20 11:42:15.147470 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:15.147441 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp" event={"ID":"8e0954af-e279-48c2-8485-2e1a2c5da32f","Type":"ContainerStarted","Data":"09883a40b55ab7a1739c80fed893bfd1b43ce3e725121dd70795804337f2430f"}
Apr 20 11:42:15.159148 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:15.159106 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-plzm6" podStartSLOduration=1.159093914 podStartE2EDuration="1.159093914s" podCreationTimestamp="2026-04-20 11:42:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 11:42:15.158834186 +0000 UTC m=+22.631935265" watchObservedRunningTime="2026-04-20 11:42:15.159093914 +0000 UTC m=+22.632195000"
Apr 20 11:42:16.152134 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:16.152100 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6c62_9a0e8a26-b424-497e-b30b-d497b9949b05/ovn-acl-logging/0.log"
Apr 20 11:42:16.152727 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:16.152488 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" event={"ID":"9a0e8a26-b424-497e-b30b-d497b9949b05","Type":"ContainerStarted","Data":"9a4c531144c8ccfb49a5953d9760775ae71b32167e2087b82ded7de6ffc6a10c"}
Apr 20 11:42:16.154399 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:16.154367 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp" event={"ID":"8e0954af-e279-48c2-8485-2e1a2c5da32f","Type":"ContainerStarted","Data":"a385240c1b9e318b01e0eed18b7fee0d211e2c229d9ec4d860a2d698266481d2"}
Apr 20 11:42:16.190006 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:16.189959 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wt4qp" podStartSLOduration=3.51469974 podStartE2EDuration="23.189947466s" podCreationTimestamp="2026-04-20 11:41:53 +0000 UTC" firstStartedPulling="2026-04-20 11:41:55.92254381 +0000 UTC m=+3.395644878" lastFinishedPulling="2026-04-20 11:42:15.597791532 +0000 UTC m=+23.070892604" observedRunningTime="2026-04-20 11:42:16.189656414 +0000 UTC m=+23.662757503" watchObservedRunningTime="2026-04-20 11:42:16.189947466 +0000 UTC m=+23.663048552"
Apr 20 11:42:17.050385 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:17.050335 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-48wpt"
Apr 20 11:42:17.050586 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:17.050335 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mrrxd"
Apr 20 11:42:17.050586 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:17.050478 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gl2dq"
Apr 20 11:42:17.050586 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:17.050475 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-48wpt" podUID="07c8c3fc-2976-4ee9-904f-92f6a777b537"
Apr 20 11:42:17.050795 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:17.050610 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mrrxd" podUID="bceb7c3e-9a84-4f27-8b25-3497e4f2353e"
Apr 20 11:42:17.050795 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:17.050715 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gl2dq" podUID="3f86abc4-981a-497f-8da8-2b998417e124"
Apr 20 11:42:17.662589 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:17.662547 2585 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-92zb6"
Apr 20 11:42:17.663552 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:17.663526 2585 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-92zb6"
Apr 20 11:42:18.158514 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:18.158481 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-92zb6"
Apr 20 11:42:18.159044 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:18.159027 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-92zb6"
Apr 20 11:42:19.050366 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:19.050330 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mrrxd"
Apr 20 11:42:19.050878 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:19.050330 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gl2dq"
Apr 20 11:42:19.050878 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:19.050453 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-mrrxd" podUID="bceb7c3e-9a84-4f27-8b25-3497e4f2353e" Apr 20 11:42:19.050878 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:19.050517 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gl2dq" podUID="3f86abc4-981a-497f-8da8-2b998417e124" Apr 20 11:42:19.050878 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:19.050564 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-48wpt" Apr 20 11:42:19.050878 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:19.050721 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-48wpt" podUID="07c8c3fc-2976-4ee9-904f-92f6a777b537" Apr 20 11:42:19.161708 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:19.161659 2585 generic.go:358] "Generic (PLEG): container finished" podID="7648da60-df45-45eb-92a1-f0b097849361" containerID="4ce1b6462a96c8c7d5bb956196f45d8c505c7a770a5138fe764f85f69d4ed8f3" exitCode=0 Apr 20 11:42:19.161866 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:19.161747 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2kbqc" event={"ID":"7648da60-df45-45eb-92a1-f0b097849361","Type":"ContainerDied","Data":"4ce1b6462a96c8c7d5bb956196f45d8c505c7a770a5138fe764f85f69d4ed8f3"} Apr 20 11:42:19.164983 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:19.164967 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6c62_9a0e8a26-b424-497e-b30b-d497b9949b05/ovn-acl-logging/0.log" Apr 20 11:42:19.165282 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:19.165258 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" event={"ID":"9a0e8a26-b424-497e-b30b-d497b9949b05","Type":"ContainerStarted","Data":"2545bdc479083d4c7f3efdb05b751af5f966ebecdbe62368ffc83ea78fae0236"} Apr 20 11:42:19.165680 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:19.165660 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" Apr 20 11:42:19.165782 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:19.165705 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" Apr 20 11:42:19.165912 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:19.165899 2585 scope.go:117] "RemoveContainer" containerID="ffb6ea7afa5d7d680ac5264db11558c4a695ff82e2547c5a50f9f343c886cb2e" Apr 20 11:42:19.182090 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:19.182026 
2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" Apr 20 11:42:20.170174 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:20.169961 2585 generic.go:358] "Generic (PLEG): container finished" podID="7648da60-df45-45eb-92a1-f0b097849361" containerID="7d0c9905701bc6d641418469006d01be511d21c355ba2c217c7e62e69c0eeef8" exitCode=0 Apr 20 11:42:20.170572 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:20.170043 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2kbqc" event={"ID":"7648da60-df45-45eb-92a1-f0b097849361","Type":"ContainerDied","Data":"7d0c9905701bc6d641418469006d01be511d21c355ba2c217c7e62e69c0eeef8"} Apr 20 11:42:20.174675 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:20.174654 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6c62_9a0e8a26-b424-497e-b30b-d497b9949b05/ovn-acl-logging/0.log" Apr 20 11:42:20.175104 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:20.175075 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" event={"ID":"9a0e8a26-b424-497e-b30b-d497b9949b05","Type":"ContainerStarted","Data":"8698e018bdf1f80b8fb7a6818f4aefe578e2bd8df87378e2c88854b15e676660"} Apr 20 11:42:20.175489 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:20.175466 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" Apr 20 11:42:20.189885 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:20.189860 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" Apr 20 11:42:20.223628 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:20.223541 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-l6c62" podStartSLOduration=10.364955862 
podStartE2EDuration="27.223527981s" podCreationTimestamp="2026-04-20 11:41:53 +0000 UTC" firstStartedPulling="2026-04-20 11:41:55.922526181 +0000 UTC m=+3.395627249" lastFinishedPulling="2026-04-20 11:42:12.781098292 +0000 UTC m=+20.254199368" observedRunningTime="2026-04-20 11:42:20.223175228 +0000 UTC m=+27.696276325" watchObservedRunningTime="2026-04-20 11:42:20.223527981 +0000 UTC m=+27.696629066" Apr 20 11:42:20.373519 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:20.373478 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gl2dq"] Apr 20 11:42:20.373673 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:20.373621 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gl2dq" Apr 20 11:42:20.373785 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:20.373759 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gl2dq" podUID="3f86abc4-981a-497f-8da8-2b998417e124" Apr 20 11:42:20.378419 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:20.378392 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-48wpt"] Apr 20 11:42:20.378537 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:20.378499 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-48wpt" Apr 20 11:42:20.378625 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:20.378604 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-48wpt" podUID="07c8c3fc-2976-4ee9-904f-92f6a777b537" Apr 20 11:42:20.382668 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:20.382648 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mrrxd"] Apr 20 11:42:20.382760 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:20.382751 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mrrxd" Apr 20 11:42:20.382830 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:20.382815 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mrrxd" podUID="bceb7c3e-9a84-4f27-8b25-3497e4f2353e" Apr 20 11:42:21.179495 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:21.179426 2585 generic.go:358] "Generic (PLEG): container finished" podID="7648da60-df45-45eb-92a1-f0b097849361" containerID="98a9e7078d76dea404197cc5a521f12bed665a19efffdce453ecb56671b48baa" exitCode=0 Apr 20 11:42:21.179984 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:21.179558 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2kbqc" event={"ID":"7648da60-df45-45eb-92a1-f0b097849361","Type":"ContainerDied","Data":"98a9e7078d76dea404197cc5a521f12bed665a19efffdce453ecb56671b48baa"} Apr 20 11:42:22.050586 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:22.050551 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mrrxd" Apr 20 11:42:22.050586 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:22.050555 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-48wpt" Apr 20 11:42:22.050836 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:22.050555 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gl2dq" Apr 20 11:42:22.050836 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:22.050749 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-48wpt" podUID="07c8c3fc-2976-4ee9-904f-92f6a777b537" Apr 20 11:42:22.050836 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:22.050784 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gl2dq" podUID="3f86abc4-981a-497f-8da8-2b998417e124" Apr 20 11:42:22.050960 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:22.050864 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mrrxd" podUID="bceb7c3e-9a84-4f27-8b25-3497e4f2353e" Apr 20 11:42:24.050329 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:24.050291 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mrrxd" Apr 20 11:42:24.051083 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:24.050299 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-48wpt" Apr 20 11:42:24.051083 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:24.050412 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mrrxd" podUID="bceb7c3e-9a84-4f27-8b25-3497e4f2353e" Apr 20 11:42:24.051083 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:24.050492 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-48wpt" podUID="07c8c3fc-2976-4ee9-904f-92f6a777b537" Apr 20 11:42:24.051083 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:24.050300 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gl2dq" Apr 20 11:42:24.051083 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:24.050593 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gl2dq" podUID="3f86abc4-981a-497f-8da8-2b998417e124" Apr 20 11:42:25.915547 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:25.915317 2585 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-26.ec2.internal" event="NodeReady" Apr 20 11:42:25.916105 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:25.915670 2585 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 11:42:25.975176 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:25.975140 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"] Apr 20 11:42:25.979497 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:25.979464 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl" Apr 20 11:42:25.982769 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:25.982578 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jqbrx\"" Apr 20 11:42:25.982769 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:25.982648 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 11:42:25.982769 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:25.982654 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 11:42:25.982769 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:25.982654 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 11:42:25.993061 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:25.991849 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 11:42:25.993061 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:25.993029 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"] Apr 20 11:42:25.997257 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:25.996787 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cdjdz"] Apr 20 11:42:26.000347 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.000303 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-6rsg9"] Apr 20 11:42:26.000541 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.000516 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-cdjdz" Apr 20 11:42:26.003118 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.003100 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lg2zf"] Apr 20 11:42:26.003265 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.003247 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6rsg9" Apr 20 11:42:26.004516 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.004385 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 11:42:26.004624 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.004571 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 11:42:26.004683 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.004628 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5crpn\"" Apr 20 11:42:26.005927 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.005910 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pc7s"] Apr 20 11:42:26.006084 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.006062 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lg2zf" Apr 20 11:42:26.007313 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.007296 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 20 11:42:26.008634 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.008616 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rbgjz"] Apr 20 11:42:26.008832 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.008808 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 20 11:42:26.008943 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.008927 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-s6smf\"" Apr 20 11:42:26.009005 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.008994 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 11:42:26.009164 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.009147 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 11:42:26.009239 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.009164 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-d7bp9\"" Apr 20 11:42:26.011571 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.011476 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rbgjz" Apr 20 11:42:26.011571 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.011535 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pc7s" Apr 20 11:42:26.013718 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.013655 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-6rsg9"] Apr 20 11:42:26.013828 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.013795 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cdjdz"] Apr 20 11:42:26.016153 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.015334 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 11:42:26.016153 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.015574 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 20 11:42:26.016153 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.015655 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-twcdr\"" Apr 20 11:42:26.016153 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.015969 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 11:42:26.016153 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.016006 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-ltn9x\"" Apr 20 11:42:26.016712 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.016665 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 11:42:26.016793 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.016759 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 20 11:42:26.016907 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.016892 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rbgjz"] Apr 20 11:42:26.016977 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.016962 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 20 11:42:26.017032 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.017018 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 20 11:42:26.018107 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.018088 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lg2zf"] Apr 20 11:42:26.029535 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.029514 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pc7s"] Apr 20 11:42:26.050061 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.050038 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-48wpt" Apr 20 11:42:26.050178 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.050083 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mrrxd" Apr 20 11:42:26.050259 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.050247 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gl2dq" Apr 20 11:42:26.055329 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.055307 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 11:42:26.055329 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.055334 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 11:42:26.055581 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.055566 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lppl4\"" Apr 20 11:42:26.055644 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.055610 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dmnrq\"" Apr 20 11:42:26.110243 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.110212 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3b617c5e-ef25-4c2f-b279-519030aa35e0-image-registry-private-configuration\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl" Apr 20 11:42:26.110407 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.110274 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4wmq\" (UniqueName: \"kubernetes.io/projected/920f60a6-ca33-484b-b844-63588b7c2913-kube-api-access-k4wmq\") pod \"dns-default-cdjdz\" (UID: \"920f60a6-ca33-484b-b844-63588b7c2913\") " pod="openshift-dns/dns-default-cdjdz" Apr 20 11:42:26.110407 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.110302 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b617c5e-ef25-4c2f-b279-519030aa35e0-ca-trust-extracted\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl" Apr 20 11:42:26.110407 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.110331 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/920f60a6-ca33-484b-b844-63588b7c2913-tmp-dir\") pod \"dns-default-cdjdz\" (UID: \"920f60a6-ca33-484b-b844-63588b7c2913\") " pod="openshift-dns/dns-default-cdjdz" Apr 20 11:42:26.110407 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.110354 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b617c5e-ef25-4c2f-b279-519030aa35e0-trusted-ca\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl" Apr 20 11:42:26.110407 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.110398 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/920f60a6-ca33-484b-b844-63588b7c2913-metrics-tls\") pod \"dns-default-cdjdz\" (UID: \"920f60a6-ca33-484b-b844-63588b7c2913\") " pod="openshift-dns/dns-default-cdjdz" Apr 20 11:42:26.110607 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.110420 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b617c5e-ef25-4c2f-b279-519030aa35e0-installation-pull-secrets\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl" Apr 20 
11:42:26.110607 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.110449 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpkbw\" (UniqueName: \"kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-kube-api-access-xpkbw\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:26.110607 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.110475 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:26.110607 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.110498 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c665ef3f-dfe6-4608-901f-51a3bf39c346-cert\") pod \"ingress-canary-rbgjz\" (UID: \"c665ef3f-dfe6-4608-901f-51a3bf39c346\") " pod="openshift-ingress-canary/ingress-canary-rbgjz"
Apr 20 11:42:26.110607 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.110523 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-certificates\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:26.110607 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.110548 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba008230-f452-4546-9072-0f9d9eca2357-config\") pod \"service-ca-operator-d6fc45fc5-9pc7s\" (UID: \"ba008230-f452-4546-9072-0f9d9eca2357\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pc7s"
Apr 20 11:42:26.110607 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.110593 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mb7c\" (UniqueName: \"kubernetes.io/projected/ba008230-f452-4546-9072-0f9d9eca2357-kube-api-access-4mb7c\") pod \"service-ca-operator-d6fc45fc5-9pc7s\" (UID: \"ba008230-f452-4546-9072-0f9d9eca2357\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pc7s"
Apr 20 11:42:26.110900 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.110652 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/920f60a6-ca33-484b-b844-63588b7c2913-config-volume\") pod \"dns-default-cdjdz\" (UID: \"920f60a6-ca33-484b-b844-63588b7c2913\") " pod="openshift-dns/dns-default-cdjdz"
Apr 20 11:42:26.110900 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.110681 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jftgj\" (UniqueName: \"kubernetes.io/projected/c665ef3f-dfe6-4608-901f-51a3bf39c346-kube-api-access-jftgj\") pod \"ingress-canary-rbgjz\" (UID: \"c665ef3f-dfe6-4608-901f-51a3bf39c346\") " pod="openshift-ingress-canary/ingress-canary-rbgjz"
Apr 20 11:42:26.110900 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.110721 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/559eea80-d496-4a24-9a29-8c322f86b200-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-6rsg9\" (UID: \"559eea80-d496-4a24-9a29-8c322f86b200\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6rsg9"
Apr 20 11:42:26.110900 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.110745 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-bound-sa-token\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:26.110900 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.110761 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/559eea80-d496-4a24-9a29-8c322f86b200-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6rsg9\" (UID: \"559eea80-d496-4a24-9a29-8c322f86b200\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6rsg9"
Apr 20 11:42:26.110900 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.110781 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xgxx\" (UniqueName: \"kubernetes.io/projected/ccc123a9-d94a-4fd5-9328-4df77651bc0e-kube-api-access-6xgxx\") pod \"network-check-source-8894fc9bd-lg2zf\" (UID: \"ccc123a9-d94a-4fd5-9328-4df77651bc0e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lg2zf"
Apr 20 11:42:26.110900 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.110833 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba008230-f452-4546-9072-0f9d9eca2357-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9pc7s\" (UID: \"ba008230-f452-4546-9072-0f9d9eca2357\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pc7s"
Apr 20 11:42:26.212778 ip-10-0-141-26
kubenswrapper[2585]: I0420 11:42:26.212056 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-bound-sa-token\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:26.212778 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.212116 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/559eea80-d496-4a24-9a29-8c322f86b200-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6rsg9\" (UID: \"559eea80-d496-4a24-9a29-8c322f86b200\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6rsg9"
Apr 20 11:42:26.212778 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.212143 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xgxx\" (UniqueName: \"kubernetes.io/projected/ccc123a9-d94a-4fd5-9328-4df77651bc0e-kube-api-access-6xgxx\") pod \"network-check-source-8894fc9bd-lg2zf\" (UID: \"ccc123a9-d94a-4fd5-9328-4df77651bc0e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lg2zf"
Apr 20 11:42:26.212778 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.212174 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba008230-f452-4546-9072-0f9d9eca2357-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9pc7s\" (UID: \"ba008230-f452-4546-9072-0f9d9eca2357\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pc7s"
Apr 20 11:42:26.212778 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.212231 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3b617c5e-ef25-4c2f-b279-519030aa35e0-image-registry-private-configuration\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:26.212778 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:26.212260 2585 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 11:42:26.212778 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.212309 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4wmq\" (UniqueName: \"kubernetes.io/projected/920f60a6-ca33-484b-b844-63588b7c2913-kube-api-access-k4wmq\") pod \"dns-default-cdjdz\" (UID: \"920f60a6-ca33-484b-b844-63588b7c2913\") " pod="openshift-dns/dns-default-cdjdz"
Apr 20 11:42:26.212778 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:26.212341 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559eea80-d496-4a24-9a29-8c322f86b200-networking-console-plugin-cert podName:559eea80-d496-4a24-9a29-8c322f86b200 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:26.712320538 +0000 UTC m=+34.185421621 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/559eea80-d496-4a24-9a29-8c322f86b200-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6rsg9" (UID: "559eea80-d496-4a24-9a29-8c322f86b200") : secret "networking-console-plugin-cert" not found
Apr 20 11:42:26.212778 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.212374 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b617c5e-ef25-4c2f-b279-519030aa35e0-ca-trust-extracted\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:26.212778 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.212458 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/920f60a6-ca33-484b-b844-63588b7c2913-tmp-dir\") pod \"dns-default-cdjdz\" (UID: \"920f60a6-ca33-484b-b844-63588b7c2913\") " pod="openshift-dns/dns-default-cdjdz"
Apr 20 11:42:26.212778 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.212486 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b617c5e-ef25-4c2f-b279-519030aa35e0-trusted-ca\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:26.212778 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.212531 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/920f60a6-ca33-484b-b844-63588b7c2913-metrics-tls\") pod \"dns-default-cdjdz\" (UID: \"920f60a6-ca33-484b-b844-63588b7c2913\") " pod="openshift-dns/dns-default-cdjdz"
Apr 20 11:42:26.212778 ip-10-0-141-26 kubenswrapper[2585]:
I0420 11:42:26.212557 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b617c5e-ef25-4c2f-b279-519030aa35e0-installation-pull-secrets\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:26.212778 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.212580 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpkbw\" (UniqueName: \"kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-kube-api-access-xpkbw\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:26.212778 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.212606 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:26.212778 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.212671 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c665ef3f-dfe6-4608-901f-51a3bf39c346-cert\") pod \"ingress-canary-rbgjz\" (UID: \"c665ef3f-dfe6-4608-901f-51a3bf39c346\") " pod="openshift-ingress-canary/ingress-canary-rbgjz"
Apr 20 11:42:26.213622 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.212718 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-certificates\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:26.213622 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.212742 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba008230-f452-4546-9072-0f9d9eca2357-config\") pod \"service-ca-operator-d6fc45fc5-9pc7s\" (UID: \"ba008230-f452-4546-9072-0f9d9eca2357\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pc7s"
Apr 20 11:42:26.213622 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.212764 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mb7c\" (UniqueName: \"kubernetes.io/projected/ba008230-f452-4546-9072-0f9d9eca2357-kube-api-access-4mb7c\") pod \"service-ca-operator-d6fc45fc5-9pc7s\" (UID: \"ba008230-f452-4546-9072-0f9d9eca2357\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pc7s"
Apr 20 11:42:26.213622 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.212791 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/920f60a6-ca33-484b-b844-63588b7c2913-config-volume\") pod \"dns-default-cdjdz\" (UID: \"920f60a6-ca33-484b-b844-63588b7c2913\") " pod="openshift-dns/dns-default-cdjdz"
Apr 20 11:42:26.213622 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.212824 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jftgj\" (UniqueName: \"kubernetes.io/projected/c665ef3f-dfe6-4608-901f-51a3bf39c346-kube-api-access-jftgj\") pod \"ingress-canary-rbgjz\" (UID: \"c665ef3f-dfe6-4608-901f-51a3bf39c346\") " pod="openshift-ingress-canary/ingress-canary-rbgjz"
Apr 20 11:42:26.213622 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.212842 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/920f60a6-ca33-484b-b844-63588b7c2913-tmp-dir\") pod \"dns-default-cdjdz\" (UID: \"920f60a6-ca33-484b-b844-63588b7c2913\") " pod="openshift-dns/dns-default-cdjdz"
Apr 20 11:42:26.213622 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.212852 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/559eea80-d496-4a24-9a29-8c322f86b200-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-6rsg9\" (UID: \"559eea80-d496-4a24-9a29-8c322f86b200\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6rsg9"
Apr 20 11:42:26.213622 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.212963 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b617c5e-ef25-4c2f-b279-519030aa35e0-ca-trust-extracted\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:26.213622 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:26.213003 2585 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 11:42:26.213622 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:26.213018 2585 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl: secret "image-registry-tls" not found
Apr 20 11:42:26.213622 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:26.213130 2585 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 11:42:26.213622 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:26.213185 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c665ef3f-dfe6-4608-901f-51a3bf39c346-cert podName:c665ef3f-dfe6-4608-901f-51a3bf39c346 nodeName:}"
failed. No retries permitted until 2026-04-20 11:42:26.71316878 +0000 UTC m=+34.186269843 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c665ef3f-dfe6-4608-901f-51a3bf39c346-cert") pod "ingress-canary-rbgjz" (UID: "c665ef3f-dfe6-4608-901f-51a3bf39c346") : secret "canary-serving-cert" not found
Apr 20 11:42:26.213622 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:26.213380 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls podName:3b617c5e-ef25-4c2f-b279-519030aa35e0 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:26.713364826 +0000 UTC m=+34.186465905 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls") pod "image-registry-7bdbfd9bcc-nhjwl" (UID: "3b617c5e-ef25-4c2f-b279-519030aa35e0") : secret "image-registry-tls" not found
Apr 20 11:42:26.213622 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.213427 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba008230-f452-4546-9072-0f9d9eca2357-config\") pod \"service-ca-operator-d6fc45fc5-9pc7s\" (UID: \"ba008230-f452-4546-9072-0f9d9eca2357\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pc7s"
Apr 20 11:42:26.213622 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:26.213503 2585 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 11:42:26.213622 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.213505 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b617c5e-ef25-4c2f-b279-519030aa35e0-trusted-ca\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:26.213622 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.213531 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/920f60a6-ca33-484b-b844-63588b7c2913-config-volume\") pod \"dns-default-cdjdz\" (UID: \"920f60a6-ca33-484b-b844-63588b7c2913\") " pod="openshift-dns/dns-default-cdjdz"
Apr 20 11:42:26.214330 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:26.213547 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/920f60a6-ca33-484b-b844-63588b7c2913-metrics-tls podName:920f60a6-ca33-484b-b844-63588b7c2913 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:26.713535382 +0000 UTC m=+34.186636473 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/920f60a6-ca33-484b-b844-63588b7c2913-metrics-tls") pod "dns-default-cdjdz" (UID: "920f60a6-ca33-484b-b844-63588b7c2913") : secret "dns-default-metrics-tls" not found
Apr 20 11:42:26.214330 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.213614 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-certificates\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:26.214766 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.214454 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/559eea80-d496-4a24-9a29-8c322f86b200-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-6rsg9\" (UID: \"559eea80-d496-4a24-9a29-8c322f86b200\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6rsg9"
Apr 20 11:42:26.217313 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.217266 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba008230-f452-4546-9072-0f9d9eca2357-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9pc7s\" (UID: \"ba008230-f452-4546-9072-0f9d9eca2357\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pc7s"
Apr 20 11:42:26.217448 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.217430 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b617c5e-ef25-4c2f-b279-519030aa35e0-installation-pull-secrets\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:26.217507 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.217492 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3b617c5e-ef25-4c2f-b279-519030aa35e0-image-registry-private-configuration\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:26.224961 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.224937 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mb7c\" (UniqueName: \"kubernetes.io/projected/ba008230-f452-4546-9072-0f9d9eca2357-kube-api-access-4mb7c\") pod \"service-ca-operator-d6fc45fc5-9pc7s\" (UID: \"ba008230-f452-4546-9072-0f9d9eca2357\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pc7s"
Apr 20 11:42:26.225658 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.225634 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpkbw\" (UniqueName:
\"kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-kube-api-access-xpkbw\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:26.225874 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.225852 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xgxx\" (UniqueName: \"kubernetes.io/projected/ccc123a9-d94a-4fd5-9328-4df77651bc0e-kube-api-access-6xgxx\") pod \"network-check-source-8894fc9bd-lg2zf\" (UID: \"ccc123a9-d94a-4fd5-9328-4df77651bc0e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lg2zf"
Apr 20 11:42:26.225943 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.225888 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jftgj\" (UniqueName: \"kubernetes.io/projected/c665ef3f-dfe6-4608-901f-51a3bf39c346-kube-api-access-jftgj\") pod \"ingress-canary-rbgjz\" (UID: \"c665ef3f-dfe6-4608-901f-51a3bf39c346\") " pod="openshift-ingress-canary/ingress-canary-rbgjz"
Apr 20 11:42:26.226010 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.225993 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4wmq\" (UniqueName: \"kubernetes.io/projected/920f60a6-ca33-484b-b844-63588b7c2913-kube-api-access-k4wmq\") pod \"dns-default-cdjdz\" (UID: \"920f60a6-ca33-484b-b844-63588b7c2913\") " pod="openshift-dns/dns-default-cdjdz"
Apr 20 11:42:26.226486 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.226465 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-bound-sa-token\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:26.327628 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.327590 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lg2zf"
Apr 20 11:42:26.341755 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.341731 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pc7s"
Apr 20 11:42:26.715746 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.715701 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/920f60a6-ca33-484b-b844-63588b7c2913-metrics-tls\") pod \"dns-default-cdjdz\" (UID: \"920f60a6-ca33-484b-b844-63588b7c2913\") " pod="openshift-dns/dns-default-cdjdz"
Apr 20 11:42:26.715746 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.715753 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:26.715995 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.715782 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c665ef3f-dfe6-4608-901f-51a3bf39c346-cert\") pod \"ingress-canary-rbgjz\" (UID: \"c665ef3f-dfe6-4608-901f-51a3bf39c346\") " pod="openshift-ingress-canary/ingress-canary-rbgjz"
Apr 20 11:42:26.715995 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.715834 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/559eea80-d496-4a24-9a29-8c322f86b200-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6rsg9\" (UID: \"559eea80-d496-4a24-9a29-8c322f86b200\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6rsg9"
Apr 20 11:42:26.715995 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:26.715856 2585 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 11:42:26.715995 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:26.715879 2585 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl: secret "image-registry-tls" not found
Apr 20 11:42:26.715995 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:26.715915 2585 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 11:42:26.715995 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:26.715939 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls podName:3b617c5e-ef25-4c2f-b279-519030aa35e0 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:27.715919719 +0000 UTC m=+35.189020807 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls") pod "image-registry-7bdbfd9bcc-nhjwl" (UID: "3b617c5e-ef25-4c2f-b279-519030aa35e0") : secret "image-registry-tls" not found
Apr 20 11:42:26.715995 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:26.715947 2585 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 11:42:26.715995 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:26.715958 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c665ef3f-dfe6-4608-901f-51a3bf39c346-cert podName:c665ef3f-dfe6-4608-901f-51a3bf39c346 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:27.71594503 +0000 UTC m=+35.189046118 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c665ef3f-dfe6-4608-901f-51a3bf39c346-cert") pod "ingress-canary-rbgjz" (UID: "c665ef3f-dfe6-4608-901f-51a3bf39c346") : secret "canary-serving-cert" not found
Apr 20 11:42:26.715995 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:26.715855 2585 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 11:42:26.715995 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:26.715987 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/920f60a6-ca33-484b-b844-63588b7c2913-metrics-tls podName:920f60a6-ca33-484b-b844-63588b7c2913 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:27.715980146 +0000 UTC m=+35.189081212 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/920f60a6-ca33-484b-b844-63588b7c2913-metrics-tls") pod "dns-default-cdjdz" (UID: "920f60a6-ca33-484b-b844-63588b7c2913") : secret "dns-default-metrics-tls" not found
Apr 20 11:42:26.715995 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.715865 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs\") pod \"network-metrics-daemon-gl2dq\" (UID: \"3f86abc4-981a-497f-8da8-2b998417e124\") " pod="openshift-multus/network-metrics-daemon-gl2dq"
Apr 20 11:42:26.716576 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:26.716011 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs podName:3f86abc4-981a-497f-8da8-2b998417e124 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:58.71599557 +0000 UTC m=+66.189096642 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs") pod "network-metrics-daemon-gl2dq" (UID: "3f86abc4-981a-497f-8da8-2b998417e124") : secret "metrics-daemon-secret" not found
Apr 20 11:42:26.716576 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:26.716049 2585 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 11:42:26.716576 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:26.716080 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559eea80-d496-4a24-9a29-8c322f86b200-networking-console-plugin-cert podName:559eea80-d496-4a24-9a29-8c322f86b200 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:27.716070856 +0000 UTC m=+35.189171920 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/559eea80-d496-4a24-9a29-8c322f86b200-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6rsg9" (UID: "559eea80-d496-4a24-9a29-8c322f86b200") : secret "networking-console-plugin-cert" not found
Apr 20 11:42:26.817448 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.817407 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/07c8c3fc-2976-4ee9-904f-92f6a777b537-original-pull-secret\") pod \"global-pull-secret-syncer-48wpt\" (UID: \"07c8c3fc-2976-4ee9-904f-92f6a777b537\") " pod="kube-system/global-pull-secret-syncer-48wpt"
Apr 20 11:42:26.820207 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.820182 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/07c8c3fc-2976-4ee9-904f-92f6a777b537-original-pull-secret\") pod \"global-pull-secret-syncer-48wpt\" (UID: \"07c8c3fc-2976-4ee9-904f-92f6a777b537\") " pod="kube-system/global-pull-secret-syncer-48wpt"
Apr 20 11:42:26.918227 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.918190 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpwgq\" (UniqueName: \"kubernetes.io/projected/bceb7c3e-9a84-4f27-8b25-3497e4f2353e-kube-api-access-qpwgq\") pod \"network-check-target-mrrxd\" (UID: \"bceb7c3e-9a84-4f27-8b25-3497e4f2353e\") " pod="openshift-network-diagnostics/network-check-target-mrrxd"
Apr 20 11:42:26.921400 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.921372 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpwgq\" (UniqueName: \"kubernetes.io/projected/bceb7c3e-9a84-4f27-8b25-3497e4f2353e-kube-api-access-qpwgq\") pod \"network-check-target-mrrxd\" (UID: \"bceb7c3e-9a84-4f27-8b25-3497e4f2353e\") " pod="openshift-network-diagnostics/network-check-target-mrrxd"
Apr 20 11:42:26.960419 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.960387 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-48wpt"
Apr 20 11:42:26.967209 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:26.967140 2585 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mrrxd"
Apr 20 11:42:27.450378 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:27.450349 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mrrxd"]
Apr 20 11:42:27.453330 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:27.453306 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-48wpt"]
Apr 20 11:42:27.456858 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:27.456834 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pc7s"]
Apr 20 11:42:27.459341 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:27.459317 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lg2zf"]
Apr 20 11:42:27.479853 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:42:27.479819 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbceb7c3e_9a84_4f27_8b25_3497e4f2353e.slice/crio-26adf402d2a7af7091342732a58fe926495573049d3ff4285f2ca0040e03f32c WatchSource:0}: Error finding container 26adf402d2a7af7091342732a58fe926495573049d3ff4285f2ca0040e03f32c: Status 404 returned error can't find the container with id 26adf402d2a7af7091342732a58fe926495573049d3ff4285f2ca0040e03f32c
Apr 20 11:42:27.480656 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:42:27.480630 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07c8c3fc_2976_4ee9_904f_92f6a777b537.slice/crio-ab0bd75875bb23db0fe53128e75fcfcf5017f07593205346dc2f6f2ffe044b79 WatchSource:0}: Error finding container ab0bd75875bb23db0fe53128e75fcfcf5017f07593205346dc2f6f2ffe044b79: Status 404 returned error can't find the container with id ab0bd75875bb23db0fe53128e75fcfcf5017f07593205346dc2f6f2ffe044b79
Apr 20 11:42:27.481351 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:42:27.481321 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba008230_f452_4546_9072_0f9d9eca2357.slice/crio-762f2cc5479027c06db60794a39b0740a050848aa720e55fcea5d84c3a32efac WatchSource:0}: Error finding container 762f2cc5479027c06db60794a39b0740a050848aa720e55fcea5d84c3a32efac: Status 404 returned error can't find the container with id 762f2cc5479027c06db60794a39b0740a050848aa720e55fcea5d84c3a32efac
Apr 20 11:42:27.482183 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:42:27.482161 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccc123a9_d94a_4fd5_9328_4df77651bc0e.slice/crio-9f6596297048907bd409f7a11d9d3857a4abcaf5aaef69bedddfcf565c1a901c WatchSource:0}: Error finding container 9f6596297048907bd409f7a11d9d3857a4abcaf5aaef69bedddfcf565c1a901c: Status 404 returned error can't find the container with id 9f6596297048907bd409f7a11d9d3857a4abcaf5aaef69bedddfcf565c1a901c
Apr 20 11:42:27.726340 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:27.726153 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/920f60a6-ca33-484b-b844-63588b7c2913-metrics-tls\") pod \"dns-default-cdjdz\" (UID: \"920f60a6-ca33-484b-b844-63588b7c2913\") " pod="openshift-dns/dns-default-cdjdz"
Apr 20 11:42:27.726522 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:27.726352 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:27.726522 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:27.726371 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c665ef3f-dfe6-4608-901f-51a3bf39c346-cert\") pod \"ingress-canary-rbgjz\" (UID: \"c665ef3f-dfe6-4608-901f-51a3bf39c346\") " pod="openshift-ingress-canary/ingress-canary-rbgjz"
Apr 20 11:42:27.726522 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:27.726403 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/559eea80-d496-4a24-9a29-8c322f86b200-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6rsg9\" (UID: \"559eea80-d496-4a24-9a29-8c322f86b200\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6rsg9"
Apr 20 11:42:27.726522 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:27.726309 2585 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 11:42:27.726522 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:27.726512 2585 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 11:42:27.726920 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:27.726530 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/920f60a6-ca33-484b-b844-63588b7c2913-metrics-tls podName:920f60a6-ca33-484b-b844-63588b7c2913 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:29.726502434 +0000 UTC m=+37.199603589 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/920f60a6-ca33-484b-b844-63588b7c2913-metrics-tls") pod "dns-default-cdjdz" (UID: "920f60a6-ca33-484b-b844-63588b7c2913") : secret "dns-default-metrics-tls" not found Apr 20 11:42:27.726920 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:27.726563 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c665ef3f-dfe6-4608-901f-51a3bf39c346-cert podName:c665ef3f-dfe6-4608-901f-51a3bf39c346 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:29.72655089 +0000 UTC m=+37.199651955 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c665ef3f-dfe6-4608-901f-51a3bf39c346-cert") pod "ingress-canary-rbgjz" (UID: "c665ef3f-dfe6-4608-901f-51a3bf39c346") : secret "canary-serving-cert" not found Apr 20 11:42:27.727036 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:27.726948 2585 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 11:42:27.727036 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:27.726952 2585 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 11:42:27.727036 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:27.726972 2585 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl: secret "image-registry-tls" not found Apr 20 11:42:27.727036 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:27.726998 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559eea80-d496-4a24-9a29-8c322f86b200-networking-console-plugin-cert podName:559eea80-d496-4a24-9a29-8c322f86b200 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:29.726985508 +0000 UTC m=+37.200086572 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/559eea80-d496-4a24-9a29-8c322f86b200-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6rsg9" (UID: "559eea80-d496-4a24-9a29-8c322f86b200") : secret "networking-console-plugin-cert" not found Apr 20 11:42:27.727036 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:27.727023 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls podName:3b617c5e-ef25-4c2f-b279-519030aa35e0 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:29.727006025 +0000 UTC m=+37.200107110 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls") pod "image-registry-7bdbfd9bcc-nhjwl" (UID: "3b617c5e-ef25-4c2f-b279-519030aa35e0") : secret "image-registry-tls" not found Apr 20 11:42:28.193654 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:28.193613 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-48wpt" event={"ID":"07c8c3fc-2976-4ee9-904f-92f6a777b537","Type":"ContainerStarted","Data":"ab0bd75875bb23db0fe53128e75fcfcf5017f07593205346dc2f6f2ffe044b79"} Apr 20 11:42:28.194945 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:28.194896 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mrrxd" event={"ID":"bceb7c3e-9a84-4f27-8b25-3497e4f2353e","Type":"ContainerStarted","Data":"26adf402d2a7af7091342732a58fe926495573049d3ff4285f2ca0040e03f32c"} Apr 20 11:42:28.197073 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:28.197023 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lg2zf" 
event={"ID":"ccc123a9-d94a-4fd5-9328-4df77651bc0e","Type":"ContainerStarted","Data":"9f6596297048907bd409f7a11d9d3857a4abcaf5aaef69bedddfcf565c1a901c"} Apr 20 11:42:28.201814 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:28.201787 2585 generic.go:358] "Generic (PLEG): container finished" podID="7648da60-df45-45eb-92a1-f0b097849361" containerID="16a6c5b72bf200760dc914da19e3678881284beeffda7753fc075153baaec916" exitCode=0 Apr 20 11:42:28.201915 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:28.201868 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2kbqc" event={"ID":"7648da60-df45-45eb-92a1-f0b097849361","Type":"ContainerDied","Data":"16a6c5b72bf200760dc914da19e3678881284beeffda7753fc075153baaec916"} Apr 20 11:42:28.203934 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:28.203908 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pc7s" event={"ID":"ba008230-f452-4546-9072-0f9d9eca2357","Type":"ContainerStarted","Data":"762f2cc5479027c06db60794a39b0740a050848aa720e55fcea5d84c3a32efac"} Apr 20 11:42:29.210658 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:29.210621 2585 generic.go:358] "Generic (PLEG): container finished" podID="7648da60-df45-45eb-92a1-f0b097849361" containerID="f7cb0895d5133b57313498e4c61870fa67c995615764f1468b764749683c00e2" exitCode=0 Apr 20 11:42:29.211588 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:29.210676 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2kbqc" event={"ID":"7648da60-df45-45eb-92a1-f0b097849361","Type":"ContainerDied","Data":"f7cb0895d5133b57313498e4c61870fa67c995615764f1468b764749683c00e2"} Apr 20 11:42:29.746088 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:29.744885 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/920f60a6-ca33-484b-b844-63588b7c2913-metrics-tls\") pod \"dns-default-cdjdz\" (UID: \"920f60a6-ca33-484b-b844-63588b7c2913\") " pod="openshift-dns/dns-default-cdjdz" Apr 20 11:42:29.746088 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:29.744936 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl" Apr 20 11:42:29.746088 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:29.744967 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c665ef3f-dfe6-4608-901f-51a3bf39c346-cert\") pod \"ingress-canary-rbgjz\" (UID: \"c665ef3f-dfe6-4608-901f-51a3bf39c346\") " pod="openshift-ingress-canary/ingress-canary-rbgjz" Apr 20 11:42:29.746088 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:29.745015 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/559eea80-d496-4a24-9a29-8c322f86b200-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6rsg9\" (UID: \"559eea80-d496-4a24-9a29-8c322f86b200\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6rsg9" Apr 20 11:42:29.746088 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:29.745256 2585 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 11:42:29.746088 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:29.745329 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559eea80-d496-4a24-9a29-8c322f86b200-networking-console-plugin-cert podName:559eea80-d496-4a24-9a29-8c322f86b200 nodeName:}" 
failed. No retries permitted until 2026-04-20 11:42:33.745310527 +0000 UTC m=+41.218411598 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/559eea80-d496-4a24-9a29-8c322f86b200-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6rsg9" (UID: "559eea80-d496-4a24-9a29-8c322f86b200") : secret "networking-console-plugin-cert" not found Apr 20 11:42:29.746088 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:29.745662 2585 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 11:42:29.746088 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:29.745720 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/920f60a6-ca33-484b-b844-63588b7c2913-metrics-tls podName:920f60a6-ca33-484b-b844-63588b7c2913 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:33.745702878 +0000 UTC m=+41.218803961 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/920f60a6-ca33-484b-b844-63588b7c2913-metrics-tls") pod "dns-default-cdjdz" (UID: "920f60a6-ca33-484b-b844-63588b7c2913") : secret "dns-default-metrics-tls" not found Apr 20 11:42:29.746088 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:29.745810 2585 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 11:42:29.746088 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:29.745818 2585 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl: secret "image-registry-tls" not found Apr 20 11:42:29.746088 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:29.745853 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls podName:3b617c5e-ef25-4c2f-b279-519030aa35e0 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:33.745845122 +0000 UTC m=+41.218946186 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls") pod "image-registry-7bdbfd9bcc-nhjwl" (UID: "3b617c5e-ef25-4c2f-b279-519030aa35e0") : secret "image-registry-tls" not found Apr 20 11:42:29.746088 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:29.745890 2585 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 11:42:29.746088 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:29.745991 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c665ef3f-dfe6-4608-901f-51a3bf39c346-cert podName:c665ef3f-dfe6-4608-901f-51a3bf39c346 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:33.745906884 +0000 UTC m=+41.219007964 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c665ef3f-dfe6-4608-901f-51a3bf39c346-cert") pod "ingress-canary-rbgjz" (UID: "c665ef3f-dfe6-4608-901f-51a3bf39c346") : secret "canary-serving-cert" not found Apr 20 11:42:30.218857 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:30.218817 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2kbqc" event={"ID":"7648da60-df45-45eb-92a1-f0b097849361","Type":"ContainerStarted","Data":"56bfe520cdf3a02d7c6401b4866f12278737082078dfe60542ff79082ac4c70f"} Apr 20 11:42:30.246529 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:30.245369 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2kbqc" podStartSLOduration=5.649734678 podStartE2EDuration="37.245348868s" podCreationTimestamp="2026-04-20 11:41:53 +0000 UTC" firstStartedPulling="2026-04-20 11:41:55.918760245 +0000 UTC m=+3.391861320" lastFinishedPulling="2026-04-20 11:42:27.514374446 +0000 UTC m=+34.987475510" observedRunningTime="2026-04-20 11:42:30.243888718 +0000 UTC m=+37.716989805" watchObservedRunningTime="2026-04-20 11:42:30.245348868 +0000 UTC m=+37.718449955" Apr 20 11:42:33.780876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:33.780837 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/920f60a6-ca33-484b-b844-63588b7c2913-metrics-tls\") pod \"dns-default-cdjdz\" (UID: \"920f60a6-ca33-484b-b844-63588b7c2913\") " pod="openshift-dns/dns-default-cdjdz" Apr 20 11:42:33.780876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:33.780880 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " 
pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl" Apr 20 11:42:33.781584 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:33.780907 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c665ef3f-dfe6-4608-901f-51a3bf39c346-cert\") pod \"ingress-canary-rbgjz\" (UID: \"c665ef3f-dfe6-4608-901f-51a3bf39c346\") " pod="openshift-ingress-canary/ingress-canary-rbgjz" Apr 20 11:42:33.781584 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:33.780956 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/559eea80-d496-4a24-9a29-8c322f86b200-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6rsg9\" (UID: \"559eea80-d496-4a24-9a29-8c322f86b200\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6rsg9" Apr 20 11:42:33.781584 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:33.781004 2585 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 11:42:33.781584 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:33.781032 2585 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 11:42:33.781584 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:33.781048 2585 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 11:42:33.781584 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:33.781058 2585 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl: secret "image-registry-tls" not found Apr 20 11:42:33.781584 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:33.781071 2585 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/920f60a6-ca33-484b-b844-63588b7c2913-metrics-tls podName:920f60a6-ca33-484b-b844-63588b7c2913 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:41.781051176 +0000 UTC m=+49.254152239 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/920f60a6-ca33-484b-b844-63588b7c2913-metrics-tls") pod "dns-default-cdjdz" (UID: "920f60a6-ca33-484b-b844-63588b7c2913") : secret "dns-default-metrics-tls" not found Apr 20 11:42:33.781584 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:33.781040 2585 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 11:42:33.781584 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:33.781104 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c665ef3f-dfe6-4608-901f-51a3bf39c346-cert podName:c665ef3f-dfe6-4608-901f-51a3bf39c346 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:41.781086855 +0000 UTC m=+49.254187922 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c665ef3f-dfe6-4608-901f-51a3bf39c346-cert") pod "ingress-canary-rbgjz" (UID: "c665ef3f-dfe6-4608-901f-51a3bf39c346") : secret "canary-serving-cert" not found Apr 20 11:42:33.781584 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:33.781124 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls podName:3b617c5e-ef25-4c2f-b279-519030aa35e0 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:41.781114634 +0000 UTC m=+49.254215703 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls") pod "image-registry-7bdbfd9bcc-nhjwl" (UID: "3b617c5e-ef25-4c2f-b279-519030aa35e0") : secret "image-registry-tls" not found Apr 20 11:42:33.781584 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:33.781141 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559eea80-d496-4a24-9a29-8c322f86b200-networking-console-plugin-cert podName:559eea80-d496-4a24-9a29-8c322f86b200 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:41.781133081 +0000 UTC m=+49.254234147 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/559eea80-d496-4a24-9a29-8c322f86b200-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6rsg9" (UID: "559eea80-d496-4a24-9a29-8c322f86b200") : secret "networking-console-plugin-cert" not found Apr 20 11:42:34.227328 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:34.227290 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-48wpt" event={"ID":"07c8c3fc-2976-4ee9-904f-92f6a777b537","Type":"ContainerStarted","Data":"36df22a3503e7de88e52aac105439770416c495e44e4e829bf8c993a25f710c1"} Apr 20 11:42:34.228672 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:34.228647 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mrrxd" event={"ID":"bceb7c3e-9a84-4f27-8b25-3497e4f2353e","Type":"ContainerStarted","Data":"d58dfedcf5c9f90f2d357e3646a3495ecb688ca7c139ee0cf65a2f7a2e3b774a"} Apr 20 11:42:34.228816 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:34.228772 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-mrrxd" Apr 20 11:42:34.229981 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:34.229959 
2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lg2zf" event={"ID":"ccc123a9-d94a-4fd5-9328-4df77651bc0e","Type":"ContainerStarted","Data":"23750d050a344f240408c301c6e8784d5e8eaa17fddbc2fd1a201b861e9557af"} Apr 20 11:42:34.231235 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:34.231208 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pc7s" event={"ID":"ba008230-f452-4546-9072-0f9d9eca2357","Type":"ContainerStarted","Data":"9e7f77ac85dcc1078ee33a0a74be32ec319cfb04e4a8cca4732b8558d4aa7796"} Apr 20 11:42:34.244711 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:34.244651 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-48wpt" podStartSLOduration=34.357121825 podStartE2EDuration="40.24464027s" podCreationTimestamp="2026-04-20 11:41:54 +0000 UTC" firstStartedPulling="2026-04-20 11:42:27.490257468 +0000 UTC m=+34.963358532" lastFinishedPulling="2026-04-20 11:42:33.377775899 +0000 UTC m=+40.850876977" observedRunningTime="2026-04-20 11:42:34.243301477 +0000 UTC m=+41.716402564" watchObservedRunningTime="2026-04-20 11:42:34.24464027 +0000 UTC m=+41.717741355" Apr 20 11:42:34.282802 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:34.282727 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-mrrxd" podStartSLOduration=35.386439053 podStartE2EDuration="41.28271153s" podCreationTimestamp="2026-04-20 11:41:53 +0000 UTC" firstStartedPulling="2026-04-20 11:42:27.490168424 +0000 UTC m=+34.963269489" lastFinishedPulling="2026-04-20 11:42:33.386440898 +0000 UTC m=+40.859541966" observedRunningTime="2026-04-20 11:42:34.267143223 +0000 UTC m=+41.740244312" watchObservedRunningTime="2026-04-20 11:42:34.28271153 +0000 UTC m=+41.755812612" Apr 20 11:42:34.282982 ip-10-0-141-26 kubenswrapper[2585]: I0420 
11:42:34.282953 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lg2zf" podStartSLOduration=35.395549995 podStartE2EDuration="41.282947108s" podCreationTimestamp="2026-04-20 11:41:53 +0000 UTC" firstStartedPulling="2026-04-20 11:42:27.49022096 +0000 UTC m=+34.963322031" lastFinishedPulling="2026-04-20 11:42:33.37761807 +0000 UTC m=+40.850719144" observedRunningTime="2026-04-20 11:42:34.282706458 +0000 UTC m=+41.755807540" watchObservedRunningTime="2026-04-20 11:42:34.282947108 +0000 UTC m=+41.756048195" Apr 20 11:42:34.304088 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:34.304046 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pc7s" podStartSLOduration=34.417002211 podStartE2EDuration="40.304030703s" podCreationTimestamp="2026-04-20 11:41:54 +0000 UTC" firstStartedPulling="2026-04-20 11:42:27.490178668 +0000 UTC m=+34.963279737" lastFinishedPulling="2026-04-20 11:42:33.37720715 +0000 UTC m=+40.850308229" observedRunningTime="2026-04-20 11:42:34.303570643 +0000 UTC m=+41.776671721" watchObservedRunningTime="2026-04-20 11:42:34.304030703 +0000 UTC m=+41.777131790" Apr 20 11:42:38.959301 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:38.959271 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-ljhs9"] Apr 20 11:42:38.961982 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:38.961966 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-ljhs9" Apr 20 11:42:38.965249 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:38.965229 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 20 11:42:38.965366 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:38.965289 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 20 11:42:38.966426 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:38.966408 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 20 11:42:38.966523 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:38.966408 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 20 11:42:38.966523 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:38.966415 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-l8sgc\"" Apr 20 11:42:38.971634 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:38.971609 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-ljhs9"] Apr 20 11:42:39.016075 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:39.016046 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cfeb0647-4980-4ec3-8246-117ecbebb052-signing-key\") pod \"service-ca-865cb79987-ljhs9\" (UID: \"cfeb0647-4980-4ec3-8246-117ecbebb052\") " pod="openshift-service-ca/service-ca-865cb79987-ljhs9" Apr 20 11:42:39.016204 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:39.016091 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lmkx\" (UniqueName: 
\"kubernetes.io/projected/cfeb0647-4980-4ec3-8246-117ecbebb052-kube-api-access-6lmkx\") pod \"service-ca-865cb79987-ljhs9\" (UID: \"cfeb0647-4980-4ec3-8246-117ecbebb052\") " pod="openshift-service-ca/service-ca-865cb79987-ljhs9" Apr 20 11:42:39.016204 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:39.016112 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cfeb0647-4980-4ec3-8246-117ecbebb052-signing-cabundle\") pod \"service-ca-865cb79987-ljhs9\" (UID: \"cfeb0647-4980-4ec3-8246-117ecbebb052\") " pod="openshift-service-ca/service-ca-865cb79987-ljhs9" Apr 20 11:42:39.116519 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:39.116483 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cfeb0647-4980-4ec3-8246-117ecbebb052-signing-key\") pod \"service-ca-865cb79987-ljhs9\" (UID: \"cfeb0647-4980-4ec3-8246-117ecbebb052\") " pod="openshift-service-ca/service-ca-865cb79987-ljhs9" Apr 20 11:42:39.116625 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:39.116544 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6lmkx\" (UniqueName: \"kubernetes.io/projected/cfeb0647-4980-4ec3-8246-117ecbebb052-kube-api-access-6lmkx\") pod \"service-ca-865cb79987-ljhs9\" (UID: \"cfeb0647-4980-4ec3-8246-117ecbebb052\") " pod="openshift-service-ca/service-ca-865cb79987-ljhs9" Apr 20 11:42:39.116625 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:39.116575 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cfeb0647-4980-4ec3-8246-117ecbebb052-signing-cabundle\") pod \"service-ca-865cb79987-ljhs9\" (UID: \"cfeb0647-4980-4ec3-8246-117ecbebb052\") " pod="openshift-service-ca/service-ca-865cb79987-ljhs9" Apr 20 11:42:39.117286 ip-10-0-141-26 kubenswrapper[2585]: I0420 
11:42:39.117270 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cfeb0647-4980-4ec3-8246-117ecbebb052-signing-cabundle\") pod \"service-ca-865cb79987-ljhs9\" (UID: \"cfeb0647-4980-4ec3-8246-117ecbebb052\") " pod="openshift-service-ca/service-ca-865cb79987-ljhs9"
Apr 20 11:42:39.120364 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:39.120345 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cfeb0647-4980-4ec3-8246-117ecbebb052-signing-key\") pod \"service-ca-865cb79987-ljhs9\" (UID: \"cfeb0647-4980-4ec3-8246-117ecbebb052\") " pod="openshift-service-ca/service-ca-865cb79987-ljhs9"
Apr 20 11:42:39.125774 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:39.125751 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lmkx\" (UniqueName: \"kubernetes.io/projected/cfeb0647-4980-4ec3-8246-117ecbebb052-kube-api-access-6lmkx\") pod \"service-ca-865cb79987-ljhs9\" (UID: \"cfeb0647-4980-4ec3-8246-117ecbebb052\") " pod="openshift-service-ca/service-ca-865cb79987-ljhs9"
Apr 20 11:42:39.274781 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:39.274681 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-ljhs9"
Apr 20 11:42:39.392476 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:39.392449 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-ljhs9"]
Apr 20 11:42:39.395194 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:42:39.395167 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfeb0647_4980_4ec3_8246_117ecbebb052.slice/crio-adea6353fdc59945e0afb5e77ad5ca30207293bb2dfc2c00e5dba7e8a071d5d6 WatchSource:0}: Error finding container adea6353fdc59945e0afb5e77ad5ca30207293bb2dfc2c00e5dba7e8a071d5d6: Status 404 returned error can't find the container with id adea6353fdc59945e0afb5e77ad5ca30207293bb2dfc2c00e5dba7e8a071d5d6
Apr 20 11:42:40.245099 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:40.245059 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-ljhs9" event={"ID":"cfeb0647-4980-4ec3-8246-117ecbebb052","Type":"ContainerStarted","Data":"1774862078fcd1d04235b1f82e58152a0cb39e17c2d11c7fb3e35c82d5df64af"}
Apr 20 11:42:40.245099 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:40.245095 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-ljhs9" event={"ID":"cfeb0647-4980-4ec3-8246-117ecbebb052","Type":"ContainerStarted","Data":"adea6353fdc59945e0afb5e77ad5ca30207293bb2dfc2c00e5dba7e8a071d5d6"}
Apr 20 11:42:40.274449 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:40.274410 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-ljhs9" podStartSLOduration=2.27439604 podStartE2EDuration="2.27439604s" podCreationTimestamp="2026-04-20 11:42:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 11:42:40.274029365 +0000 UTC m=+47.747130451" watchObservedRunningTime="2026-04-20 11:42:40.27439604 +0000 UTC m=+47.747497126"
Apr 20 11:42:41.836635 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:41.836594 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/559eea80-d496-4a24-9a29-8c322f86b200-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6rsg9\" (UID: \"559eea80-d496-4a24-9a29-8c322f86b200\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6rsg9"
Apr 20 11:42:41.837147 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:41.836661 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/920f60a6-ca33-484b-b844-63588b7c2913-metrics-tls\") pod \"dns-default-cdjdz\" (UID: \"920f60a6-ca33-484b-b844-63588b7c2913\") " pod="openshift-dns/dns-default-cdjdz"
Apr 20 11:42:41.837147 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:41.836680 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:41.837147 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:41.836715 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c665ef3f-dfe6-4608-901f-51a3bf39c346-cert\") pod \"ingress-canary-rbgjz\" (UID: \"c665ef3f-dfe6-4608-901f-51a3bf39c346\") " pod="openshift-ingress-canary/ingress-canary-rbgjz"
Apr 20 11:42:41.837147 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:41.836762 2585 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 11:42:41.837147 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:41.836805 2585 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 11:42:41.837147 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:41.836810 2585 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 11:42:41.837147 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:41.836832 2585 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl: secret "image-registry-tls" not found
Apr 20 11:42:41.837147 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:41.836806 2585 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 11:42:41.837147 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:41.836834 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559eea80-d496-4a24-9a29-8c322f86b200-networking-console-plugin-cert podName:559eea80-d496-4a24-9a29-8c322f86b200 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:57.836816551 +0000 UTC m=+65.309917615 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/559eea80-d496-4a24-9a29-8c322f86b200-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6rsg9" (UID: "559eea80-d496-4a24-9a29-8c322f86b200") : secret "networking-console-plugin-cert" not found
Apr 20 11:42:41.837147 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:41.836879 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/920f60a6-ca33-484b-b844-63588b7c2913-metrics-tls podName:920f60a6-ca33-484b-b844-63588b7c2913 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:57.836864885 +0000 UTC m=+65.309965950 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/920f60a6-ca33-484b-b844-63588b7c2913-metrics-tls") pod "dns-default-cdjdz" (UID: "920f60a6-ca33-484b-b844-63588b7c2913") : secret "dns-default-metrics-tls" not found
Apr 20 11:42:41.837147 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:41.836899 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls podName:3b617c5e-ef25-4c2f-b279-519030aa35e0 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:57.836889062 +0000 UTC m=+65.309990127 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls") pod "image-registry-7bdbfd9bcc-nhjwl" (UID: "3b617c5e-ef25-4c2f-b279-519030aa35e0") : secret "image-registry-tls" not found
Apr 20 11:42:41.837147 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:42:41.836922 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c665ef3f-dfe6-4608-901f-51a3bf39c346-cert podName:c665ef3f-dfe6-4608-901f-51a3bf39c346 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:57.836911397 +0000 UTC m=+65.310012463 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c665ef3f-dfe6-4608-901f-51a3bf39c346-cert") pod "ingress-canary-rbgjz" (UID: "c665ef3f-dfe6-4608-901f-51a3bf39c346") : secret "canary-serving-cert" not found
Apr 20 11:42:52.192214 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:52.192184 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l6c62"
Apr 20 11:42:57.800172 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.800137 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xlb7l"]
Apr 20 11:42:57.807000 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.806976 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xlb7l"
Apr 20 11:42:57.809899 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.809879 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 20 11:42:57.811235 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.811214 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 20 11:42:57.811553 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.811535 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 20 11:42:57.811553 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.811543 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-sfws4\""
Apr 20 11:42:57.815079 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.815054 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xlb7l"]
Apr 20 11:42:57.851270 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.851243 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/559eea80-d496-4a24-9a29-8c322f86b200-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6rsg9\" (UID: \"559eea80-d496-4a24-9a29-8c322f86b200\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6rsg9"
Apr 20 11:42:57.851363 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.851300 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/920f60a6-ca33-484b-b844-63588b7c2913-metrics-tls\") pod \"dns-default-cdjdz\" (UID: \"920f60a6-ca33-484b-b844-63588b7c2913\") " pod="openshift-dns/dns-default-cdjdz"
Apr 20 11:42:57.851363 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.851328 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:57.851363 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.851346 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c665ef3f-dfe6-4608-901f-51a3bf39c346-cert\") pod \"ingress-canary-rbgjz\" (UID: \"c665ef3f-dfe6-4608-901f-51a3bf39c346\") " pod="openshift-ingress-canary/ingress-canary-rbgjz"
Apr 20 11:42:57.851475 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.851365 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb4b3505-fd96-4a34-9b8a-35b95c2afdec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xlb7l\" (UID: \"cb4b3505-fd96-4a34-9b8a-35b95c2afdec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xlb7l"
Apr 20 11:42:57.851475 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.851392 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkx2k\" (UniqueName: \"kubernetes.io/projected/cb4b3505-fd96-4a34-9b8a-35b95c2afdec-kube-api-access-bkx2k\") pod \"cluster-samples-operator-6dc5bdb6b4-xlb7l\" (UID: \"cb4b3505-fd96-4a34-9b8a-35b95c2afdec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xlb7l"
Apr 20 11:42:57.853882 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.853862 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls\") pod \"image-registry-7bdbfd9bcc-nhjwl\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:57.853974 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.853883 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/920f60a6-ca33-484b-b844-63588b7c2913-metrics-tls\") pod \"dns-default-cdjdz\" (UID: \"920f60a6-ca33-484b-b844-63588b7c2913\") " pod="openshift-dns/dns-default-cdjdz"
Apr 20 11:42:57.853974 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.853940 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/559eea80-d496-4a24-9a29-8c322f86b200-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6rsg9\" (UID: \"559eea80-d496-4a24-9a29-8c322f86b200\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6rsg9"
Apr 20 11:42:57.853974 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.853962 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c665ef3f-dfe6-4608-901f-51a3bf39c346-cert\") pod \"ingress-canary-rbgjz\" (UID: \"c665ef3f-dfe6-4608-901f-51a3bf39c346\") " pod="openshift-ingress-canary/ingress-canary-rbgjz"
Apr 20 11:42:57.915561 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.915529 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-4hfdx"]
Apr 20 11:42:57.918679 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.918664 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-xvfqc"]
Apr 20 11:42:57.918826 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.918811 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-4hfdx"
Apr 20 11:42:57.922017 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.921999 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvfqc"
Apr 20 11:42:57.923214 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.923191 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 20 11:42:57.923837 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.923806 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 20 11:42:57.924376 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.924354 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 20 11:42:57.924817 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.924592 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-fj8qj\""
Apr 20 11:42:57.925269 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.925248 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 20 11:42:57.925339 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.925285 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 20 11:42:57.926924 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.926901 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 11:42:57.927229 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.927207 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 11:42:57.927620 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.927601 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-fgg2q\""
Apr 20 11:42:57.929403 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.929385 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 20 11:42:57.931281 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.931258 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 20 11:42:57.937979 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.937960 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-4hfdx"]
Apr 20 11:42:57.939185 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.939164 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-xvfqc"]
Apr 20 11:42:57.951822 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.951797 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4f3fdecc-690d-48d0-95a7-c7427f0f366b-snapshots\") pod \"insights-operator-585dfdc468-4hfdx\" (UID: \"4f3fdecc-690d-48d0-95a7-c7427f0f366b\") " pod="openshift-insights/insights-operator-585dfdc468-4hfdx"
Apr 20 11:42:57.951952 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.951838 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f3fdecc-690d-48d0-95a7-c7427f0f366b-service-ca-bundle\") pod \"insights-operator-585dfdc468-4hfdx\" (UID: \"4f3fdecc-690d-48d0-95a7-c7427f0f366b\") " pod="openshift-insights/insights-operator-585dfdc468-4hfdx"
Apr 20 11:42:57.951952 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.951877 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb4b3505-fd96-4a34-9b8a-35b95c2afdec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xlb7l\" (UID: \"cb4b3505-fd96-4a34-9b8a-35b95c2afdec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xlb7l"
Apr 20 11:42:57.951952 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.951942 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f3fdecc-690d-48d0-95a7-c7427f0f366b-tmp\") pod \"insights-operator-585dfdc468-4hfdx\" (UID: \"4f3fdecc-690d-48d0-95a7-c7427f0f366b\") " pod="openshift-insights/insights-operator-585dfdc468-4hfdx"
Apr 20 11:42:57.952127 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.951997 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkx2k\" (UniqueName: \"kubernetes.io/projected/cb4b3505-fd96-4a34-9b8a-35b95c2afdec-kube-api-access-bkx2k\") pod \"cluster-samples-operator-6dc5bdb6b4-xlb7l\" (UID: \"cb4b3505-fd96-4a34-9b8a-35b95c2afdec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xlb7l"
Apr 20 11:42:57.952127 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.952033 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6af6cd58-fc4f-4628-8259-91d3ffbbcea7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xvfqc\" (UID: \"6af6cd58-fc4f-4628-8259-91d3ffbbcea7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvfqc"
Apr 20 11:42:57.952127 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.952063 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f3fdecc-690d-48d0-95a7-c7427f0f366b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-4hfdx\" (UID: \"4f3fdecc-690d-48d0-95a7-c7427f0f366b\") " pod="openshift-insights/insights-operator-585dfdc468-4hfdx"
Apr 20 11:42:57.952127 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.952094 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2p64\" (UniqueName: \"kubernetes.io/projected/6af6cd58-fc4f-4628-8259-91d3ffbbcea7-kube-api-access-n2p64\") pod \"cluster-monitoring-operator-75587bd455-xvfqc\" (UID: \"6af6cd58-fc4f-4628-8259-91d3ffbbcea7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvfqc"
Apr 20 11:42:57.952319 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.952145 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6af6cd58-fc4f-4628-8259-91d3ffbbcea7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-xvfqc\" (UID: \"6af6cd58-fc4f-4628-8259-91d3ffbbcea7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvfqc"
Apr 20 11:42:57.952319 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.952179 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98d5g\" (UniqueName: \"kubernetes.io/projected/4f3fdecc-690d-48d0-95a7-c7427f0f366b-kube-api-access-98d5g\") pod \"insights-operator-585dfdc468-4hfdx\" (UID: \"4f3fdecc-690d-48d0-95a7-c7427f0f366b\") " pod="openshift-insights/insights-operator-585dfdc468-4hfdx"
Apr 20 11:42:57.952319 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.952221 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f3fdecc-690d-48d0-95a7-c7427f0f366b-serving-cert\") pod \"insights-operator-585dfdc468-4hfdx\" (UID: \"4f3fdecc-690d-48d0-95a7-c7427f0f366b\") " pod="openshift-insights/insights-operator-585dfdc468-4hfdx"
Apr 20 11:42:57.954490 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.954467 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb4b3505-fd96-4a34-9b8a-35b95c2afdec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xlb7l\" (UID: \"cb4b3505-fd96-4a34-9b8a-35b95c2afdec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xlb7l"
Apr 20 11:42:57.963947 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:57.963929 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkx2k\" (UniqueName: \"kubernetes.io/projected/cb4b3505-fd96-4a34-9b8a-35b95c2afdec-kube-api-access-bkx2k\") pod \"cluster-samples-operator-6dc5bdb6b4-xlb7l\" (UID: \"cb4b3505-fd96-4a34-9b8a-35b95c2afdec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xlb7l"
Apr 20 11:42:58.053162 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.053078 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6af6cd58-fc4f-4628-8259-91d3ffbbcea7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xvfqc\" (UID: \"6af6cd58-fc4f-4628-8259-91d3ffbbcea7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvfqc"
Apr 20 11:42:58.053162 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.053109 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f3fdecc-690d-48d0-95a7-c7427f0f366b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-4hfdx\" (UID: \"4f3fdecc-690d-48d0-95a7-c7427f0f366b\") " pod="openshift-insights/insights-operator-585dfdc468-4hfdx"
Apr 20 11:42:58.053162 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.053131 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2p64\" (UniqueName: \"kubernetes.io/projected/6af6cd58-fc4f-4628-8259-91d3ffbbcea7-kube-api-access-n2p64\") pod \"cluster-monitoring-operator-75587bd455-xvfqc\" (UID: \"6af6cd58-fc4f-4628-8259-91d3ffbbcea7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvfqc"
Apr 20 11:42:58.053162 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.053163 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6af6cd58-fc4f-4628-8259-91d3ffbbcea7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-xvfqc\" (UID: \"6af6cd58-fc4f-4628-8259-91d3ffbbcea7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvfqc"
Apr 20 11:42:58.053469 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.053182 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98d5g\" (UniqueName: \"kubernetes.io/projected/4f3fdecc-690d-48d0-95a7-c7427f0f366b-kube-api-access-98d5g\") pod \"insights-operator-585dfdc468-4hfdx\" (UID: \"4f3fdecc-690d-48d0-95a7-c7427f0f366b\") " pod="openshift-insights/insights-operator-585dfdc468-4hfdx"
Apr 20 11:42:58.053469 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.053197 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f3fdecc-690d-48d0-95a7-c7427f0f366b-serving-cert\") pod \"insights-operator-585dfdc468-4hfdx\" (UID: \"4f3fdecc-690d-48d0-95a7-c7427f0f366b\") " pod="openshift-insights/insights-operator-585dfdc468-4hfdx"
Apr 20 11:42:58.053469 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.053217 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4f3fdecc-690d-48d0-95a7-c7427f0f366b-snapshots\") pod \"insights-operator-585dfdc468-4hfdx\" (UID: \"4f3fdecc-690d-48d0-95a7-c7427f0f366b\") " pod="openshift-insights/insights-operator-585dfdc468-4hfdx"
Apr 20 11:42:58.053469 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.053241 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f3fdecc-690d-48d0-95a7-c7427f0f366b-service-ca-bundle\") pod \"insights-operator-585dfdc468-4hfdx\" (UID: \"4f3fdecc-690d-48d0-95a7-c7427f0f366b\") " pod="openshift-insights/insights-operator-585dfdc468-4hfdx"
Apr 20 11:42:58.053469 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.053281 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f3fdecc-690d-48d0-95a7-c7427f0f366b-tmp\") pod \"insights-operator-585dfdc468-4hfdx\" (UID: \"4f3fdecc-690d-48d0-95a7-c7427f0f366b\") " pod="openshift-insights/insights-operator-585dfdc468-4hfdx"
Apr 20 11:42:58.053934 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.053764 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f3fdecc-690d-48d0-95a7-c7427f0f366b-tmp\") pod \"insights-operator-585dfdc468-4hfdx\" (UID: \"4f3fdecc-690d-48d0-95a7-c7427f0f366b\") " pod="openshift-insights/insights-operator-585dfdc468-4hfdx"
Apr 20 11:42:58.054104 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.054043 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f3fdecc-690d-48d0-95a7-c7427f0f366b-service-ca-bundle\") pod \"insights-operator-585dfdc468-4hfdx\" (UID: \"4f3fdecc-690d-48d0-95a7-c7427f0f366b\") " pod="openshift-insights/insights-operator-585dfdc468-4hfdx"
Apr 20 11:42:58.054202 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.054097 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6af6cd58-fc4f-4628-8259-91d3ffbbcea7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-xvfqc\" (UID: \"6af6cd58-fc4f-4628-8259-91d3ffbbcea7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvfqc"
Apr 20 11:42:58.054371 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.054349 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4f3fdecc-690d-48d0-95a7-c7427f0f366b-snapshots\") pod \"insights-operator-585dfdc468-4hfdx\" (UID: \"4f3fdecc-690d-48d0-95a7-c7427f0f366b\") " pod="openshift-insights/insights-operator-585dfdc468-4hfdx"
Apr 20 11:42:58.054435 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.054423 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f3fdecc-690d-48d0-95a7-c7427f0f366b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-4hfdx\" (UID: \"4f3fdecc-690d-48d0-95a7-c7427f0f366b\") " pod="openshift-insights/insights-operator-585dfdc468-4hfdx"
Apr 20 11:42:58.055842 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.055822 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6af6cd58-fc4f-4628-8259-91d3ffbbcea7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xvfqc\" (UID: \"6af6cd58-fc4f-4628-8259-91d3ffbbcea7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvfqc"
Apr 20 11:42:58.056197 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.056181 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f3fdecc-690d-48d0-95a7-c7427f0f366b-serving-cert\") pod \"insights-operator-585dfdc468-4hfdx\" (UID: \"4f3fdecc-690d-48d0-95a7-c7427f0f366b\") " pod="openshift-insights/insights-operator-585dfdc468-4hfdx"
Apr 20 11:42:58.065180 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.065156 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2p64\" (UniqueName: \"kubernetes.io/projected/6af6cd58-fc4f-4628-8259-91d3ffbbcea7-kube-api-access-n2p64\") pod \"cluster-monitoring-operator-75587bd455-xvfqc\" (UID: \"6af6cd58-fc4f-4628-8259-91d3ffbbcea7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvfqc"
Apr 20 11:42:58.068561 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.068544 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98d5g\" (UniqueName: \"kubernetes.io/projected/4f3fdecc-690d-48d0-95a7-c7427f0f366b-kube-api-access-98d5g\") pod \"insights-operator-585dfdc468-4hfdx\" (UID: \"4f3fdecc-690d-48d0-95a7-c7427f0f366b\") " pod="openshift-insights/insights-operator-585dfdc468-4hfdx"
Apr 20 11:42:58.095621 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.095597 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jqbrx\""
Apr 20 11:42:58.103738 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.103716 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"
Apr 20 11:42:58.114325 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.114309 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5crpn\""
Apr 20 11:42:58.115375 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.115362 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xlb7l"
Apr 20 11:42:58.123212 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.123048 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cdjdz"
Apr 20 11:42:58.123287 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.123249 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-s6smf\""
Apr 20 11:42:58.131245 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.131220 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6rsg9"
Apr 20 11:42:58.139643 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.139494 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-twcdr\""
Apr 20 11:42:58.146382 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.146357 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rbgjz"
Apr 20 11:42:58.231910 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.231877 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-4hfdx"
Apr 20 11:42:58.237819 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.237472 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvfqc"
Apr 20 11:42:58.281872 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.279961 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"]
Apr 20 11:42:58.284433 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:42:58.284380 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b617c5e_ef25_4c2f_b279_519030aa35e0.slice/crio-d5a61cd1fad0f2caa12e7f0aec82ad3870f9772d7d757903e6ed2e4148adf665 WatchSource:0}: Error finding container d5a61cd1fad0f2caa12e7f0aec82ad3870f9772d7d757903e6ed2e4148adf665: Status 404 returned error can't find the container with id d5a61cd1fad0f2caa12e7f0aec82ad3870f9772d7d757903e6ed2e4148adf665
Apr 20 11:42:58.297616 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.297571 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl" event={"ID":"3b617c5e-ef25-4c2f-b279-519030aa35e0","Type":"ContainerStarted","Data":"d5a61cd1fad0f2caa12e7f0aec82ad3870f9772d7d757903e6ed2e4148adf665"}
Apr 20 11:42:58.299609 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.299583 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xlb7l"]
Apr 20 11:42:58.328222 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.327432 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cdjdz"]
Apr 20 11:42:58.333354 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:42:58.333274 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod920f60a6_ca33_484b_b844_63588b7c2913.slice/crio-c4e52fce6b732cf372f4aefea67d9bc8c82cbb715d651573084f956ea0be613e WatchSource:0}: Error finding container c4e52fce6b732cf372f4aefea67d9bc8c82cbb715d651573084f956ea0be613e: Status 404 returned error can't find the container with id c4e52fce6b732cf372f4aefea67d9bc8c82cbb715d651573084f956ea0be613e
Apr 20 11:42:58.380522 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.380465 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-6rsg9"]
Apr 20 11:42:58.382372 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:42:58.382227 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod559eea80_d496_4a24_9a29_8c322f86b200.slice/crio-d05a4a60c8c9ca9a95074808d3b03cf822cfd063cf8467f622b6e279f50b4b55 WatchSource:0}: Error finding container d05a4a60c8c9ca9a95074808d3b03cf822cfd063cf8467f622b6e279f50b4b55: Status 404 returned error can't find the container with id d05a4a60c8c9ca9a95074808d3b03cf822cfd063cf8467f622b6e279f50b4b55
Apr 20 11:42:58.407794 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.407766 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-xvfqc"]
Apr 20 11:42:58.412218 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:42:58.412190 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6af6cd58_fc4f_4628_8259_91d3ffbbcea7.slice/crio-3be81bc2d910b5356d5931cec251afc3f6ce927f6d7038b870dbcf4e061df153 WatchSource:0}: Error finding container 3be81bc2d910b5356d5931cec251afc3f6ce927f6d7038b870dbcf4e061df153: Status 404 returned error can't find the container with id 3be81bc2d910b5356d5931cec251afc3f6ce927f6d7038b870dbcf4e061df153
Apr 20 11:42:58.426513 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.426486 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-4hfdx"]
Apr 20 11:42:58.429149 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:42:58.429125 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f3fdecc_690d_48d0_95a7_c7427f0f366b.slice/crio-a1c2a5dcbdf86350f566193a18eba629b6a05417a68c747fd2933588fa78f5a2 WatchSource:0}: Error finding container a1c2a5dcbdf86350f566193a18eba629b6a05417a68c747fd2933588fa78f5a2: Status 404 returned error can't find the container with id a1c2a5dcbdf86350f566193a18eba629b6a05417a68c747fd2933588fa78f5a2
Apr 20 11:42:58.553942 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.553912 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rbgjz"]
Apr 20 11:42:58.557321 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:42:58.557254 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc665ef3f_dfe6_4608_901f_51a3bf39c346.slice/crio-e95d1e2cd0b6617ecf667c42baebef26c44a56180838fc8bc167a181bd595946 WatchSource:0}: Error finding container e95d1e2cd0b6617ecf667c42baebef26c44a56180838fc8bc167a181bd595946: Status 404 returned error can't find the container with id e95d1e2cd0b6617ecf667c42baebef26c44a56180838fc8bc167a181bd595946
Apr 20 11:42:58.760657 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.760621 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs\") pod \"network-metrics-daemon-gl2dq\" (UID: \"3f86abc4-981a-497f-8da8-2b998417e124\") " pod="openshift-multus/network-metrics-daemon-gl2dq"
Apr 20 11:42:58.763466 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.763437 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f86abc4-981a-497f-8da8-2b998417e124-metrics-certs\") pod \"network-metrics-daemon-gl2dq\" (UID: \"3f86abc4-981a-497f-8da8-2b998417e124\") " 
pod="openshift-multus/network-metrics-daemon-gl2dq" Apr 20 11:42:58.776304 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.776277 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lppl4\"" Apr 20 11:42:58.783814 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.783786 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gl2dq" Apr 20 11:42:58.927738 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:58.927707 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gl2dq"] Apr 20 11:42:58.935379 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:42:58.935345 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f86abc4_981a_497f_8da8_2b998417e124.slice/crio-ea58288c7d6da5dd58ef591f4036b215ae917384909e3f038cf17406afafab27 WatchSource:0}: Error finding container ea58288c7d6da5dd58ef591f4036b215ae917384909e3f038cf17406afafab27: Status 404 returned error can't find the container with id ea58288c7d6da5dd58ef591f4036b215ae917384909e3f038cf17406afafab27 Apr 20 11:42:59.315868 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:59.311590 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-4hfdx" event={"ID":"4f3fdecc-690d-48d0-95a7-c7427f0f366b","Type":"ContainerStarted","Data":"a1c2a5dcbdf86350f566193a18eba629b6a05417a68c747fd2933588fa78f5a2"} Apr 20 11:42:59.315868 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:59.313569 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rbgjz" event={"ID":"c665ef3f-dfe6-4608-901f-51a3bf39c346","Type":"ContainerStarted","Data":"e95d1e2cd0b6617ecf667c42baebef26c44a56180838fc8bc167a181bd595946"} Apr 20 11:42:59.315868 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:59.315357 2585 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6rsg9" event={"ID":"559eea80-d496-4a24-9a29-8c322f86b200","Type":"ContainerStarted","Data":"d05a4a60c8c9ca9a95074808d3b03cf822cfd063cf8467f622b6e279f50b4b55"} Apr 20 11:42:59.322866 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:59.322842 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gl2dq" event={"ID":"3f86abc4-981a-497f-8da8-2b998417e124","Type":"ContainerStarted","Data":"ea58288c7d6da5dd58ef591f4036b215ae917384909e3f038cf17406afafab27"} Apr 20 11:42:59.330905 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:59.330875 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvfqc" event={"ID":"6af6cd58-fc4f-4628-8259-91d3ffbbcea7","Type":"ContainerStarted","Data":"3be81bc2d910b5356d5931cec251afc3f6ce927f6d7038b870dbcf4e061df153"} Apr 20 11:42:59.333773 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:59.333728 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cdjdz" event={"ID":"920f60a6-ca33-484b-b844-63588b7c2913","Type":"ContainerStarted","Data":"c4e52fce6b732cf372f4aefea67d9bc8c82cbb715d651573084f956ea0be613e"} Apr 20 11:42:59.337859 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:59.336953 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl" event={"ID":"3b617c5e-ef25-4c2f-b279-519030aa35e0","Type":"ContainerStarted","Data":"e9972533e256883ea780eec9b6570ee0c1201d0c8a6d85af6bbbe937a8f632ce"} Apr 20 11:42:59.337859 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:59.337815 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl" Apr 20 11:42:59.341509 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:59.341465 2585 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xlb7l" event={"ID":"cb4b3505-fd96-4a34-9b8a-35b95c2afdec","Type":"ContainerStarted","Data":"eea6cc44f75be43f89a29e54c8ff242453950bb5220badc8c27509e2ebdd8faa"} Apr 20 11:42:59.360237 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:42:59.360189 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl" podStartSLOduration=66.360173317 podStartE2EDuration="1m6.360173317s" podCreationTimestamp="2026-04-20 11:41:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 11:42:59.359038955 +0000 UTC m=+66.832140054" watchObservedRunningTime="2026-04-20 11:42:59.360173317 +0000 UTC m=+66.833274384" Apr 20 11:43:04.938155 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:04.938120 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-7b657"] Apr 20 11:43:04.951243 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:04.951170 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-7b657"] Apr 20 11:43:04.951427 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:04.951336 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7b657" Apr 20 11:43:04.954483 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:04.954451 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-q98rz\"" Apr 20 11:43:04.955809 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:04.955784 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 20 11:43:04.955914 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:04.955785 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 20 11:43:05.014321 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.014283 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xm2s\" (UniqueName: \"kubernetes.io/projected/8a7500c2-cf33-4c82-888a-5ad636ca165a-kube-api-access-6xm2s\") pod \"migrator-74bb7799d9-7b657\" (UID: \"8a7500c2-cf33-4c82-888a-5ad636ca165a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7b657" Apr 20 11:43:05.066446 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.066414 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vgkcl"] Apr 20 11:43:05.094855 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.094780 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vgkcl"] Apr 20 11:43:05.094987 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.094952 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vgkcl" Apr 20 11:43:05.097900 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.097877 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 20 11:43:05.098031 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.097961 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-qdhwm\"" Apr 20 11:43:05.114950 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.114925 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xm2s\" (UniqueName: \"kubernetes.io/projected/8a7500c2-cf33-4c82-888a-5ad636ca165a-kube-api-access-6xm2s\") pod \"migrator-74bb7799d9-7b657\" (UID: \"8a7500c2-cf33-4c82-888a-5ad636ca165a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7b657" Apr 20 11:43:05.123378 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.123354 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xm2s\" (UniqueName: \"kubernetes.io/projected/8a7500c2-cf33-4c82-888a-5ad636ca165a-kube-api-access-6xm2s\") pod \"migrator-74bb7799d9-7b657\" (UID: \"8a7500c2-cf33-4c82-888a-5ad636ca165a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7b657" Apr 20 11:43:05.215297 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.215260 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/d6aab5ee-7cd7-4b3e-8f13-1f59029ad976-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-vgkcl\" (UID: \"d6aab5ee-7cd7-4b3e-8f13-1f59029ad976\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vgkcl" Apr 20 11:43:05.236360 ip-10-0-141-26 
kubenswrapper[2585]: I0420 11:43:05.236338 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mrrxd" Apr 20 11:43:05.261569 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.261542 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7b657" Apr 20 11:43:05.317533 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.316072 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/d6aab5ee-7cd7-4b3e-8f13-1f59029ad976-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-vgkcl\" (UID: \"d6aab5ee-7cd7-4b3e-8f13-1f59029ad976\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vgkcl" Apr 20 11:43:05.317533 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:43:05.317135 2585 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 20 11:43:05.317533 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:43:05.317234 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6aab5ee-7cd7-4b3e-8f13-1f59029ad976-tls-certificates podName:d6aab5ee-7cd7-4b3e-8f13-1f59029ad976 nodeName:}" failed. No retries permitted until 2026-04-20 11:43:05.817213435 +0000 UTC m=+73.290314517 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/d6aab5ee-7cd7-4b3e-8f13-1f59029ad976-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-vgkcl" (UID: "d6aab5ee-7cd7-4b3e-8f13-1f59029ad976") : secret "prometheus-operator-admission-webhook-tls" not found Apr 20 11:43:05.362107 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.361982 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gl2dq" event={"ID":"3f86abc4-981a-497f-8da8-2b998417e124","Type":"ContainerStarted","Data":"4b02af5e0afaf6f90f572d7900ad04fa631676861b6708937f20766e468eb6cd"} Apr 20 11:43:05.362107 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.362028 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gl2dq" event={"ID":"3f86abc4-981a-497f-8da8-2b998417e124","Type":"ContainerStarted","Data":"2fe11494e7661156c7392b53f1c5721dee2d2117f84f8a78312fcf75856a113e"} Apr 20 11:43:05.364585 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.364556 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvfqc" event={"ID":"6af6cd58-fc4f-4628-8259-91d3ffbbcea7","Type":"ContainerStarted","Data":"3e0b68bbaf641c10ae27f92f0c2194cd92bc04aa4babcac9e0f2d415157a11c2"} Apr 20 11:43:05.366808 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.366761 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cdjdz" event={"ID":"920f60a6-ca33-484b-b844-63588b7c2913","Type":"ContainerStarted","Data":"c88bd0a4f1dd222b4f3c1e480574807efb6aea0bd9bd666871e7d0ad9440e0b3"} Apr 20 11:43:05.366808 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.366788 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cdjdz" 
event={"ID":"920f60a6-ca33-484b-b844-63588b7c2913","Type":"ContainerStarted","Data":"d1a2aa5fc462374e0727cf792b6130bf398f737a717b1e4f0edb009465508a49"} Apr 20 11:43:05.367009 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.366973 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-cdjdz" Apr 20 11:43:05.370205 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.369853 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xlb7l" event={"ID":"cb4b3505-fd96-4a34-9b8a-35b95c2afdec","Type":"ContainerStarted","Data":"8760331e0200e5abb8bf4a6bbfe19fd73cfbe344b94c1b84e9c8a0013f965a8d"} Apr 20 11:43:05.370205 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.370167 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xlb7l" event={"ID":"cb4b3505-fd96-4a34-9b8a-35b95c2afdec","Type":"ContainerStarted","Data":"c81f78180cf9032f1a43a7a8246f078057a1aeb742d1a6bfd7db03098ebd99f7"} Apr 20 11:43:05.374071 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.373065 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-4hfdx" event={"ID":"4f3fdecc-690d-48d0-95a7-c7427f0f366b","Type":"ContainerStarted","Data":"af9a197bf05d9d6d7904b80633a995f49e862650e5eb70c9c0f79739b6c6f5a3"} Apr 20 11:43:05.377565 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.377522 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rbgjz" event={"ID":"c665ef3f-dfe6-4608-901f-51a3bf39c346","Type":"ContainerStarted","Data":"2126d57fb894fc3db81f7522b2b3248f2407806c58da85f99da40635afab5c62"} Apr 20 11:43:05.378872 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.378854 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6rsg9" 
event={"ID":"559eea80-d496-4a24-9a29-8c322f86b200","Type":"ContainerStarted","Data":"63a3553088b97d6c75f8db05d3c0680608ac81649ae78be437f597d9fde9eaa7"} Apr 20 11:43:05.401210 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.401035 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gl2dq" podStartSLOduration=67.006794851 podStartE2EDuration="1m12.401017646s" podCreationTimestamp="2026-04-20 11:41:53 +0000 UTC" firstStartedPulling="2026-04-20 11:42:58.938128561 +0000 UTC m=+66.411229628" lastFinishedPulling="2026-04-20 11:43:04.332351346 +0000 UTC m=+71.805452423" observedRunningTime="2026-04-20 11:43:05.380714578 +0000 UTC m=+72.853815658" watchObservedRunningTime="2026-04-20 11:43:05.401017646 +0000 UTC m=+72.874118733" Apr 20 11:43:05.402621 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.402593 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-7b657"] Apr 20 11:43:05.405602 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.405553 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cdjdz" podStartSLOduration=34.409711138 podStartE2EDuration="40.405538506s" podCreationTimestamp="2026-04-20 11:42:25 +0000 UTC" firstStartedPulling="2026-04-20 11:42:58.33620269 +0000 UTC m=+65.809303763" lastFinishedPulling="2026-04-20 11:43:04.332030068 +0000 UTC m=+71.805131131" observedRunningTime="2026-04-20 11:43:05.40359489 +0000 UTC m=+72.876695978" watchObservedRunningTime="2026-04-20 11:43:05.405538506 +0000 UTC m=+72.878639596" Apr 20 11:43:05.409233 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:43:05.409207 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a7500c2_cf33_4c82_888a_5ad636ca165a.slice/crio-c0dec185f891591ca014ec0ed00e752a6e4f935fd88d2959ecc819bb59785fab WatchSource:0}: Error finding 
container c0dec185f891591ca014ec0ed00e752a6e4f935fd88d2959ecc819bb59785fab: Status 404 returned error can't find the container with id c0dec185f891591ca014ec0ed00e752a6e4f935fd88d2959ecc819bb59785fab Apr 20 11:43:05.425077 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.425022 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rbgjz" podStartSLOduration=34.646494232 podStartE2EDuration="40.425004819s" podCreationTimestamp="2026-04-20 11:42:25 +0000 UTC" firstStartedPulling="2026-04-20 11:42:58.559753576 +0000 UTC m=+66.032854639" lastFinishedPulling="2026-04-20 11:43:04.338264158 +0000 UTC m=+71.811365226" observedRunningTime="2026-04-20 11:43:05.422749445 +0000 UTC m=+72.895850524" watchObservedRunningTime="2026-04-20 11:43:05.425004819 +0000 UTC m=+72.898105910" Apr 20 11:43:05.441565 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.441511 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-4hfdx" podStartSLOduration=2.539998269 podStartE2EDuration="8.441495635s" podCreationTimestamp="2026-04-20 11:42:57 +0000 UTC" firstStartedPulling="2026-04-20 11:42:58.430846305 +0000 UTC m=+65.903947372" lastFinishedPulling="2026-04-20 11:43:04.332343661 +0000 UTC m=+71.805444738" observedRunningTime="2026-04-20 11:43:05.441006108 +0000 UTC m=+72.914107195" watchObservedRunningTime="2026-04-20 11:43:05.441495635 +0000 UTC m=+72.914596721" Apr 20 11:43:05.469291 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.469238 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvfqc" podStartSLOduration=2.550954919 podStartE2EDuration="8.469219135s" podCreationTimestamp="2026-04-20 11:42:57 +0000 UTC" firstStartedPulling="2026-04-20 11:42:58.414205165 +0000 UTC m=+65.887306229" lastFinishedPulling="2026-04-20 11:43:04.33246938 +0000 UTC m=+71.805570445" 
observedRunningTime="2026-04-20 11:43:05.468376003 +0000 UTC m=+72.941477081" watchObservedRunningTime="2026-04-20 11:43:05.469219135 +0000 UTC m=+72.942320221" Apr 20 11:43:05.497250 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.497197 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xlb7l" podStartSLOduration=2.554947029 podStartE2EDuration="8.497178469s" podCreationTimestamp="2026-04-20 11:42:57 +0000 UTC" firstStartedPulling="2026-04-20 11:42:58.390397769 +0000 UTC m=+65.863498834" lastFinishedPulling="2026-04-20 11:43:04.332629195 +0000 UTC m=+71.805730274" observedRunningTime="2026-04-20 11:43:05.496753071 +0000 UTC m=+72.969854156" watchObservedRunningTime="2026-04-20 11:43:05.497178469 +0000 UTC m=+72.970279556" Apr 20 11:43:05.514282 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.514234 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6rsg9" podStartSLOduration=40.567194227 podStartE2EDuration="46.514224994s" podCreationTimestamp="2026-04-20 11:42:19 +0000 UTC" firstStartedPulling="2026-04-20 11:42:58.384431181 +0000 UTC m=+65.857532246" lastFinishedPulling="2026-04-20 11:43:04.331461936 +0000 UTC m=+71.804563013" observedRunningTime="2026-04-20 11:43:05.513998453 +0000 UTC m=+72.987099541" watchObservedRunningTime="2026-04-20 11:43:05.514224994 +0000 UTC m=+72.987326080" Apr 20 11:43:05.822022 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.821994 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/d6aab5ee-7cd7-4b3e-8f13-1f59029ad976-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-vgkcl\" (UID: \"d6aab5ee-7cd7-4b3e-8f13-1f59029ad976\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vgkcl" Apr 20 11:43:05.824622 
ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:05.824589 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/d6aab5ee-7cd7-4b3e-8f13-1f59029ad976-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-vgkcl\" (UID: \"d6aab5ee-7cd7-4b3e-8f13-1f59029ad976\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vgkcl" Apr 20 11:43:06.004294 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:06.004262 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vgkcl" Apr 20 11:43:06.149043 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:06.149010 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vgkcl"] Apr 20 11:43:06.152214 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:43:06.152177 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6aab5ee_7cd7_4b3e_8f13_1f59029ad976.slice/crio-c4cb0fa6460be6ce09ad6a3326a83a6464332876d782c08603fbb6c55d1a6ede WatchSource:0}: Error finding container c4cb0fa6460be6ce09ad6a3326a83a6464332876d782c08603fbb6c55d1a6ede: Status 404 returned error can't find the container with id c4cb0fa6460be6ce09ad6a3326a83a6464332876d782c08603fbb6c55d1a6ede Apr 20 11:43:06.383276 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:06.383175 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vgkcl" event={"ID":"d6aab5ee-7cd7-4b3e-8f13-1f59029ad976","Type":"ContainerStarted","Data":"c4cb0fa6460be6ce09ad6a3326a83a6464332876d782c08603fbb6c55d1a6ede"} Apr 20 11:43:06.384252 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:06.384224 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7b657" event={"ID":"8a7500c2-cf33-4c82-888a-5ad636ca165a","Type":"ContainerStarted","Data":"c0dec185f891591ca014ec0ed00e752a6e4f935fd88d2959ecc819bb59785fab"} Apr 20 11:43:07.389285 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:07.389242 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7b657" event={"ID":"8a7500c2-cf33-4c82-888a-5ad636ca165a","Type":"ContainerStarted","Data":"7a4e63baf8c6ba1a7ec5ab6b182bd6d5d1d9e9688c4501fa2c7ca2a9b44a7a27"} Apr 20 11:43:07.389285 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:07.389290 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7b657" event={"ID":"8a7500c2-cf33-4c82-888a-5ad636ca165a","Type":"ContainerStarted","Data":"c4dacabae7a4da12d0e074a758b7b5a457021d61c109691f6f6577f121787c86"} Apr 20 11:43:07.470854 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:07.470827 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cdjdz_920f60a6-ca33-484b-b844-63588b7c2913/dns/0.log" Apr 20 11:43:07.645644 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:07.645575 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cdjdz_920f60a6-ca33-484b-b844-63588b7c2913/kube-rbac-proxy/0.log" Apr 20 11:43:08.393834 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:08.393789 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vgkcl" event={"ID":"d6aab5ee-7cd7-4b3e-8f13-1f59029ad976","Type":"ContainerStarted","Data":"409cab19686b7c6acde2d8fcd694e4a3498704938796daf4c39e38989d7f30c8"} Apr 20 11:43:08.394254 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:08.394133 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vgkcl" Apr 20 11:43:08.398790 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:08.398768 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vgkcl" Apr 20 11:43:08.411886 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:08.411839 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7b657" podStartSLOduration=2.8983651950000002 podStartE2EDuration="4.411826666s" podCreationTimestamp="2026-04-20 11:43:04 +0000 UTC" firstStartedPulling="2026-04-20 11:43:05.411514944 +0000 UTC m=+72.884616015" lastFinishedPulling="2026-04-20 11:43:06.92497642 +0000 UTC m=+74.398077486" observedRunningTime="2026-04-20 11:43:07.411225209 +0000 UTC m=+74.884326295" watchObservedRunningTime="2026-04-20 11:43:08.411826666 +0000 UTC m=+75.884927752" Apr 20 11:43:08.412010 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:08.411950 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vgkcl" podStartSLOduration=1.625604798 podStartE2EDuration="3.411943673s" podCreationTimestamp="2026-04-20 11:43:05 +0000 UTC" firstStartedPulling="2026-04-20 11:43:06.155995302 +0000 UTC m=+73.629096366" lastFinishedPulling="2026-04-20 11:43:07.942334174 +0000 UTC m=+75.415435241" observedRunningTime="2026-04-20 11:43:08.411935303 +0000 UTC m=+75.885036390" watchObservedRunningTime="2026-04-20 11:43:08.411943673 +0000 UTC m=+75.885044763" Apr 20 11:43:09.044790 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:09.044759 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-plzm6_80706795-3e55-4fb6-9b83-da08f0522340/dns-node-resolver/0.log" Apr 20 11:43:09.245777 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:09.245747 2585 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7bdbfd9bcc-nhjwl_3b617c5e-ef25-4c2f-b279-519030aa35e0/registry/0.log" Apr 20 11:43:09.644676 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:09.644646 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kvmvq_bb716ff4-9386-4b54-8b88-2680a1fb36a1/node-ca/0.log" Apr 20 11:43:10.846210 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:10.846181 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rbgjz_c665ef3f-dfe6-4608-901f-51a3bf39c346/serve-healthcheck-canary/0.log" Apr 20 11:43:11.048055 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:11.048028 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-7b657_8a7500c2-cf33-4c82-888a-5ad636ca165a/migrator/0.log" Apr 20 11:43:11.244446 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:11.244370 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-7b657_8a7500c2-cf33-4c82-888a-5ad636ca165a/graceful-termination/0.log" Apr 20 11:43:13.059472 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.059444 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-f6qb2"] Apr 20 11:43:13.064293 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.064272 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-f6qb2" Apr 20 11:43:13.067459 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.067437 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 11:43:13.067582 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.067457 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 11:43:13.067582 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.067467 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rffgp\"" Apr 20 11:43:13.075564 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.075542 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-f6qb2"] Apr 20 11:43:13.178392 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.178360 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1e431d47-4bc8-456d-a58b-fb3263c9d358-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-f6qb2\" (UID: \"1e431d47-4bc8-456d-a58b-fb3263c9d358\") " pod="openshift-insights/insights-runtime-extractor-f6qb2" Apr 20 11:43:13.178524 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.178407 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1e431d47-4bc8-456d-a58b-fb3263c9d358-crio-socket\") pod \"insights-runtime-extractor-f6qb2\" (UID: \"1e431d47-4bc8-456d-a58b-fb3263c9d358\") " pod="openshift-insights/insights-runtime-extractor-f6qb2" Apr 20 11:43:13.178524 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.178425 2585 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1e431d47-4bc8-456d-a58b-fb3263c9d358-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-f6qb2\" (UID: \"1e431d47-4bc8-456d-a58b-fb3263c9d358\") " pod="openshift-insights/insights-runtime-extractor-f6qb2" Apr 20 11:43:13.178606 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.178517 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1e431d47-4bc8-456d-a58b-fb3263c9d358-data-volume\") pod \"insights-runtime-extractor-f6qb2\" (UID: \"1e431d47-4bc8-456d-a58b-fb3263c9d358\") " pod="openshift-insights/insights-runtime-extractor-f6qb2" Apr 20 11:43:13.178606 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.178545 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk6q7\" (UniqueName: \"kubernetes.io/projected/1e431d47-4bc8-456d-a58b-fb3263c9d358-kube-api-access-fk6q7\") pod \"insights-runtime-extractor-f6qb2\" (UID: \"1e431d47-4bc8-456d-a58b-fb3263c9d358\") " pod="openshift-insights/insights-runtime-extractor-f6qb2" Apr 20 11:43:13.279395 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.279363 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1e431d47-4bc8-456d-a58b-fb3263c9d358-crio-socket\") pod \"insights-runtime-extractor-f6qb2\" (UID: \"1e431d47-4bc8-456d-a58b-fb3263c9d358\") " pod="openshift-insights/insights-runtime-extractor-f6qb2" Apr 20 11:43:13.279531 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.279401 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1e431d47-4bc8-456d-a58b-fb3263c9d358-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-f6qb2\" (UID: 
\"1e431d47-4bc8-456d-a58b-fb3263c9d358\") " pod="openshift-insights/insights-runtime-extractor-f6qb2" Apr 20 11:43:13.279531 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.279456 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1e431d47-4bc8-456d-a58b-fb3263c9d358-data-volume\") pod \"insights-runtime-extractor-f6qb2\" (UID: \"1e431d47-4bc8-456d-a58b-fb3263c9d358\") " pod="openshift-insights/insights-runtime-extractor-f6qb2" Apr 20 11:43:13.279531 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.279482 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fk6q7\" (UniqueName: \"kubernetes.io/projected/1e431d47-4bc8-456d-a58b-fb3263c9d358-kube-api-access-fk6q7\") pod \"insights-runtime-extractor-f6qb2\" (UID: \"1e431d47-4bc8-456d-a58b-fb3263c9d358\") " pod="openshift-insights/insights-runtime-extractor-f6qb2" Apr 20 11:43:13.279531 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.279485 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1e431d47-4bc8-456d-a58b-fb3263c9d358-crio-socket\") pod \"insights-runtime-extractor-f6qb2\" (UID: \"1e431d47-4bc8-456d-a58b-fb3263c9d358\") " pod="openshift-insights/insights-runtime-extractor-f6qb2" Apr 20 11:43:13.279531 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.279516 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1e431d47-4bc8-456d-a58b-fb3263c9d358-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-f6qb2\" (UID: \"1e431d47-4bc8-456d-a58b-fb3263c9d358\") " pod="openshift-insights/insights-runtime-extractor-f6qb2" Apr 20 11:43:13.279841 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.279822 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" 
(UniqueName: \"kubernetes.io/empty-dir/1e431d47-4bc8-456d-a58b-fb3263c9d358-data-volume\") pod \"insights-runtime-extractor-f6qb2\" (UID: \"1e431d47-4bc8-456d-a58b-fb3263c9d358\") " pod="openshift-insights/insights-runtime-extractor-f6qb2" Apr 20 11:43:13.280016 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.279985 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1e431d47-4bc8-456d-a58b-fb3263c9d358-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-f6qb2\" (UID: \"1e431d47-4bc8-456d-a58b-fb3263c9d358\") " pod="openshift-insights/insights-runtime-extractor-f6qb2" Apr 20 11:43:13.282048 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.282018 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1e431d47-4bc8-456d-a58b-fb3263c9d358-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-f6qb2\" (UID: \"1e431d47-4bc8-456d-a58b-fb3263c9d358\") " pod="openshift-insights/insights-runtime-extractor-f6qb2" Apr 20 11:43:13.288945 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.288922 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk6q7\" (UniqueName: \"kubernetes.io/projected/1e431d47-4bc8-456d-a58b-fb3263c9d358-kube-api-access-fk6q7\") pod \"insights-runtime-extractor-f6qb2\" (UID: \"1e431d47-4bc8-456d-a58b-fb3263c9d358\") " pod="openshift-insights/insights-runtime-extractor-f6qb2" Apr 20 11:43:13.373613 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.373543 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-f6qb2" Apr 20 11:43:13.426491 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.426466 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7797d4c54d-kmn9n"] Apr 20 11:43:13.431252 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.431229 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:13.434376 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.434309 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 11:43:13.434376 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.434357 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 11:43:13.434623 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.434403 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 11:43:13.434684 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.434656 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 11:43:13.434765 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.434364 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 11:43:13.434765 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.434310 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 11:43:13.434856 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.434532 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 11:43:13.434938 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.434916 
2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-krrnw\"" Apr 20 11:43:13.443435 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.443409 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7797d4c54d-kmn9n"] Apr 20 11:43:13.500736 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.500686 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-f6qb2"] Apr 20 11:43:13.503634 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:43:13.503591 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e431d47_4bc8_456d_a58b_fb3263c9d358.slice/crio-6d705d9cf0583be6be43fd9f6c81e36623531b70eda362fa2365b0d56c4dd4ee WatchSource:0}: Error finding container 6d705d9cf0583be6be43fd9f6c81e36623531b70eda362fa2365b0d56c4dd4ee: Status 404 returned error can't find the container with id 6d705d9cf0583be6be43fd9f6c81e36623531b70eda362fa2365b0d56c4dd4ee Apr 20 11:43:13.582383 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.582357 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-console-config\") pod \"console-7797d4c54d-kmn9n\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:13.582510 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.582410 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcnjs\" (UniqueName: \"kubernetes.io/projected/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-kube-api-access-xcnjs\") pod \"console-7797d4c54d-kmn9n\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:13.582510 ip-10-0-141-26 kubenswrapper[2585]: I0420 
11:43:13.582499 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-console-serving-cert\") pod \"console-7797d4c54d-kmn9n\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:13.582624 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.582531 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-service-ca\") pod \"console-7797d4c54d-kmn9n\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:13.582624 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.582564 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-oauth-serving-cert\") pod \"console-7797d4c54d-kmn9n\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:13.582624 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.582597 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-console-oauth-config\") pod \"console-7797d4c54d-kmn9n\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:13.683868 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.683773 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-console-serving-cert\") pod \"console-7797d4c54d-kmn9n\" (UID: 
\"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:13.683868 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.683821 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-service-ca\") pod \"console-7797d4c54d-kmn9n\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:13.683868 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.683857 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-oauth-serving-cert\") pod \"console-7797d4c54d-kmn9n\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:13.684130 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.683883 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-console-oauth-config\") pod \"console-7797d4c54d-kmn9n\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:13.684130 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.683944 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-console-config\") pod \"console-7797d4c54d-kmn9n\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:13.684130 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.683993 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcnjs\" (UniqueName: 
\"kubernetes.io/projected/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-kube-api-access-xcnjs\") pod \"console-7797d4c54d-kmn9n\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:13.684606 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.684579 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-service-ca\") pod \"console-7797d4c54d-kmn9n\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:13.684736 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.684721 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-oauth-serving-cert\") pod \"console-7797d4c54d-kmn9n\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:13.684785 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.684723 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-console-config\") pod \"console-7797d4c54d-kmn9n\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:13.686901 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.686880 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-console-serving-cert\") pod \"console-7797d4c54d-kmn9n\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:13.687163 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.687145 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-console-oauth-config\") pod \"console-7797d4c54d-kmn9n\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:13.692736 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.692717 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcnjs\" (UniqueName: \"kubernetes.io/projected/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-kube-api-access-xcnjs\") pod \"console-7797d4c54d-kmn9n\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:13.743791 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.743761 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:13.867452 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:13.867415 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7797d4c54d-kmn9n"] Apr 20 11:43:13.870062 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:43:13.870034 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3dd1b5e_8713_4dc8_a2c3_0313bc56fec7.slice/crio-6e49e601439a930ae180f6a403dc16189d7ead2de1b706b54b405d353f3ae9ae WatchSource:0}: Error finding container 6e49e601439a930ae180f6a403dc16189d7ead2de1b706b54b405d353f3ae9ae: Status 404 returned error can't find the container with id 6e49e601439a930ae180f6a403dc16189d7ead2de1b706b54b405d353f3ae9ae Apr 20 11:43:14.416389 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.416350 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7797d4c54d-kmn9n" event={"ID":"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7","Type":"ContainerStarted","Data":"6e49e601439a930ae180f6a403dc16189d7ead2de1b706b54b405d353f3ae9ae"} Apr 20 11:43:14.418276 ip-10-0-141-26 
kubenswrapper[2585]: I0420 11:43:14.418252 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f6qb2" event={"ID":"1e431d47-4bc8-456d-a58b-fb3263c9d358","Type":"ContainerStarted","Data":"2cbd68018fc03d6d7cc534ed8a5037ec22c6b12d3fac40ec4964ce0b94adc412"} Apr 20 11:43:14.418403 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.418282 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f6qb2" event={"ID":"1e431d47-4bc8-456d-a58b-fb3263c9d358","Type":"ContainerStarted","Data":"756831100d54b57e0cd1d783b4fd09199c2aa4c5ffe3458592b706ba77883ec0"} Apr 20 11:43:14.418403 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.418297 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f6qb2" event={"ID":"1e431d47-4bc8-456d-a58b-fb3263c9d358","Type":"ContainerStarted","Data":"6d705d9cf0583be6be43fd9f6c81e36623531b70eda362fa2365b0d56c4dd4ee"} Apr 20 11:43:14.516900 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.515225 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn"] Apr 20 11:43:14.519120 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.519094 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn" Apr 20 11:43:14.523332 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.522984 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 20 11:43:14.523332 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.523269 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-gx5w4\"" Apr 20 11:43:14.523332 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.523303 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 11:43:14.524565 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.524378 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 20 11:43:14.531459 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.531437 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn"] Apr 20 11:43:14.553974 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.553939 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ltnvp"] Apr 20 11:43:14.557770 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.557747 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-ltnvp" Apr 20 11:43:14.561230 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.560779 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 11:43:14.561230 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.560812 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 11:43:14.561230 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.560868 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 11:43:14.561230 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.561088 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-5j4wd\"" Apr 20 11:43:14.592430 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.592400 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d729fe95-de51-488a-8d09-dcff02e454d4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2mmtn\" (UID: \"d729fe95-de51-488a-8d09-dcff02e454d4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn" Apr 20 11:43:14.592587 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.592459 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d729fe95-de51-488a-8d09-dcff02e454d4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2mmtn\" (UID: \"d729fe95-de51-488a-8d09-dcff02e454d4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn" Apr 20 11:43:14.592587 ip-10-0-141-26 
kubenswrapper[2585]: I0420 11:43:14.592494 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dbnr\" (UniqueName: \"kubernetes.io/projected/d729fe95-de51-488a-8d09-dcff02e454d4-kube-api-access-5dbnr\") pod \"openshift-state-metrics-9d44df66c-2mmtn\" (UID: \"d729fe95-de51-488a-8d09-dcff02e454d4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn" Apr 20 11:43:14.592779 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.592588 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d729fe95-de51-488a-8d09-dcff02e454d4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2mmtn\" (UID: \"d729fe95-de51-488a-8d09-dcff02e454d4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn" Apr 20 11:43:14.693431 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.693388 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4969e65e-b617-4944-affd-e915820a2349-root\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp" Apr 20 11:43:14.693613 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.693463 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm4dq\" (UniqueName: \"kubernetes.io/projected/4969e65e-b617-4944-affd-e915820a2349-kube-api-access-cm4dq\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp" Apr 20 11:43:14.693613 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.693508 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/4969e65e-b617-4944-affd-e915820a2349-node-exporter-tls\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp" Apr 20 11:43:14.693613 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.693539 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4969e65e-b617-4944-affd-e915820a2349-node-exporter-textfile\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp" Apr 20 11:43:14.693820 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.693645 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d729fe95-de51-488a-8d09-dcff02e454d4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2mmtn\" (UID: \"d729fe95-de51-488a-8d09-dcff02e454d4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn" Apr 20 11:43:14.693820 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.693734 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4969e65e-b617-4944-affd-e915820a2349-node-exporter-accelerators-collector-config\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp" Apr 20 11:43:14.693820 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.693770 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d729fe95-de51-488a-8d09-dcff02e454d4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2mmtn\" (UID: \"d729fe95-de51-488a-8d09-dcff02e454d4\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn" Apr 20 11:43:14.693820 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.693798 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4969e65e-b617-4944-affd-e915820a2349-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp" Apr 20 11:43:14.694013 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.693840 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4969e65e-b617-4944-affd-e915820a2349-node-exporter-wtmp\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp" Apr 20 11:43:14.694013 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.693873 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d729fe95-de51-488a-8d09-dcff02e454d4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2mmtn\" (UID: \"d729fe95-de51-488a-8d09-dcff02e454d4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn" Apr 20 11:43:14.694013 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.693901 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dbnr\" (UniqueName: \"kubernetes.io/projected/d729fe95-de51-488a-8d09-dcff02e454d4-kube-api-access-5dbnr\") pod \"openshift-state-metrics-9d44df66c-2mmtn\" (UID: \"d729fe95-de51-488a-8d09-dcff02e454d4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn" Apr 20 11:43:14.694013 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.693938 2585 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4969e65e-b617-4944-affd-e915820a2349-metrics-client-ca\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:14.694013 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.693966 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4969e65e-b617-4944-affd-e915820a2349-sys\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:14.694253 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:43:14.694116 2585 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 20 11:43:14.694253 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:43:14.694174 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d729fe95-de51-488a-8d09-dcff02e454d4-openshift-state-metrics-tls podName:d729fe95-de51-488a-8d09-dcff02e454d4 nodeName:}" failed. No retries permitted until 2026-04-20 11:43:15.194154376 +0000 UTC m=+82.667255449 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/d729fe95-de51-488a-8d09-dcff02e454d4-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-2mmtn" (UID: "d729fe95-de51-488a-8d09-dcff02e454d4") : secret "openshift-state-metrics-tls" not found
Apr 20 11:43:14.694381 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.694359 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d729fe95-de51-488a-8d09-dcff02e454d4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2mmtn\" (UID: \"d729fe95-de51-488a-8d09-dcff02e454d4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn"
Apr 20 11:43:14.697714 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.697642 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d729fe95-de51-488a-8d09-dcff02e454d4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2mmtn\" (UID: \"d729fe95-de51-488a-8d09-dcff02e454d4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn"
Apr 20 11:43:14.708770 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.708613 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dbnr\" (UniqueName: \"kubernetes.io/projected/d729fe95-de51-488a-8d09-dcff02e454d4-kube-api-access-5dbnr\") pod \"openshift-state-metrics-9d44df66c-2mmtn\" (UID: \"d729fe95-de51-488a-8d09-dcff02e454d4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn"
Apr 20 11:43:14.794765 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.794717 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4969e65e-b617-4944-affd-e915820a2349-node-exporter-accelerators-collector-config\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:14.794954 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.794785 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4969e65e-b617-4944-affd-e915820a2349-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:14.794954 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.794820 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4969e65e-b617-4944-affd-e915820a2349-node-exporter-wtmp\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:14.794954 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.794859 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4969e65e-b617-4944-affd-e915820a2349-metrics-client-ca\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:14.794954 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.794885 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4969e65e-b617-4944-affd-e915820a2349-sys\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:14.794954 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.794933 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4969e65e-b617-4944-affd-e915820a2349-root\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:14.795223 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.794962 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cm4dq\" (UniqueName: \"kubernetes.io/projected/4969e65e-b617-4944-affd-e915820a2349-kube-api-access-cm4dq\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:14.795223 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.794991 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4969e65e-b617-4944-affd-e915820a2349-node-exporter-tls\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:14.795223 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.795018 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4969e65e-b617-4944-affd-e915820a2349-node-exporter-textfile\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:14.796044 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.795369 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4969e65e-b617-4944-affd-e915820a2349-node-exporter-accelerators-collector-config\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:14.796044 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.795454 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4969e65e-b617-4944-affd-e915820a2349-sys\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:14.796044 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.795574 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4969e65e-b617-4944-affd-e915820a2349-node-exporter-wtmp\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:14.796044 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.795827 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4969e65e-b617-4944-affd-e915820a2349-root\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:14.796044 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:43:14.795836 2585 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 11:43:14.796044 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:43:14.795918 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4969e65e-b617-4944-affd-e915820a2349-node-exporter-tls podName:4969e65e-b617-4944-affd-e915820a2349 nodeName:}" failed. No retries permitted until 2026-04-20 11:43:15.295897525 +0000 UTC m=+82.768998597 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/4969e65e-b617-4944-affd-e915820a2349-node-exporter-tls") pod "node-exporter-ltnvp" (UID: "4969e65e-b617-4944-affd-e915820a2349") : secret "node-exporter-tls" not found
Apr 20 11:43:14.796044 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.795926 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4969e65e-b617-4944-affd-e915820a2349-node-exporter-textfile\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:14.796618 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.796581 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4969e65e-b617-4944-affd-e915820a2349-metrics-client-ca\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:14.800850 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.800806 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4969e65e-b617-4944-affd-e915820a2349-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:14.806884 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:14.806861 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm4dq\" (UniqueName: \"kubernetes.io/projected/4969e65e-b617-4944-affd-e915820a2349-kube-api-access-cm4dq\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:15.199064 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:15.198978 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d729fe95-de51-488a-8d09-dcff02e454d4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2mmtn\" (UID: \"d729fe95-de51-488a-8d09-dcff02e454d4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn"
Apr 20 11:43:15.203241 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:15.203212 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d729fe95-de51-488a-8d09-dcff02e454d4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2mmtn\" (UID: \"d729fe95-de51-488a-8d09-dcff02e454d4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn"
Apr 20 11:43:15.299828 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:15.299793 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4969e65e-b617-4944-affd-e915820a2349-node-exporter-tls\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:15.303839 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:15.303808 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4969e65e-b617-4944-affd-e915820a2349-node-exporter-tls\") pod \"node-exporter-ltnvp\" (UID: \"4969e65e-b617-4944-affd-e915820a2349\") " pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:15.387420 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:15.387390 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cdjdz"
Apr 20 11:43:15.433895 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:15.433857 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn"
Apr 20 11:43:15.470626 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:15.470539 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ltnvp"
Apr 20 11:43:15.614667 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:15.614631 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn"]
Apr 20 11:43:17.093351 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:43:17.093310 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd729fe95_de51_488a_8d09_dcff02e454d4.slice/crio-a7da227942ec7f645e0aec44b9f1987fb4a24eaaead5e695782e67ea81914d6f WatchSource:0}: Error finding container a7da227942ec7f645e0aec44b9f1987fb4a24eaaead5e695782e67ea81914d6f: Status 404 returned error can't find the container with id a7da227942ec7f645e0aec44b9f1987fb4a24eaaead5e695782e67ea81914d6f
Apr 20 11:43:17.153024 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:43:17.152994 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4969e65e_b617_4944_affd_e915820a2349.slice/crio-eca56d35e454c26dc384524b294d7ca282025bf820717d2db1c8d19693354064 WatchSource:0}: Error finding container eca56d35e454c26dc384524b294d7ca282025bf820717d2db1c8d19693354064: Status 404 returned error can't find the container with id eca56d35e454c26dc384524b294d7ca282025bf820717d2db1c8d19693354064
Apr 20 11:43:17.429718 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:17.429613 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7797d4c54d-kmn9n" event={"ID":"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7","Type":"ContainerStarted","Data":"5ecaa3673fb90a33d4a01560bdf6ca6cbbb091528e881e20b8b1dfa7b311f45b"}
Apr 20 11:43:17.431285 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:17.431257 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn" event={"ID":"d729fe95-de51-488a-8d09-dcff02e454d4","Type":"ContainerStarted","Data":"8d9fa35ae8361d2f0da3a87a3b6fba9b32410e6b053c9a02ce2854adbf86d551"}
Apr 20 11:43:17.431401 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:17.431287 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn" event={"ID":"d729fe95-de51-488a-8d09-dcff02e454d4","Type":"ContainerStarted","Data":"74b1daea5e7f5f4b22b966fa0d643c147ff2d89605fc4e6b0a1928205016df3f"}
Apr 20 11:43:17.431401 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:17.431297 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn" event={"ID":"d729fe95-de51-488a-8d09-dcff02e454d4","Type":"ContainerStarted","Data":"a7da227942ec7f645e0aec44b9f1987fb4a24eaaead5e695782e67ea81914d6f"}
Apr 20 11:43:17.432297 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:17.432268 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ltnvp" event={"ID":"4969e65e-b617-4944-affd-e915820a2349","Type":"ContainerStarted","Data":"eca56d35e454c26dc384524b294d7ca282025bf820717d2db1c8d19693354064"}
Apr 20 11:43:17.434138 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:17.434112 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f6qb2" event={"ID":"1e431d47-4bc8-456d-a58b-fb3263c9d358","Type":"ContainerStarted","Data":"c7a55a6bfd9d73078dee955ce863e29f4768c40a10a7bd9daf6b4d65e01943c2"}
Apr 20 11:43:17.455878 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:17.455846 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7797d4c54d-kmn9n" podStartSLOduration=1.168239123 podStartE2EDuration="4.455833736s" podCreationTimestamp="2026-04-20 11:43:13 +0000 UTC" firstStartedPulling="2026-04-20 11:43:13.871896756 +0000 UTC m=+81.344997820" lastFinishedPulling="2026-04-20 11:43:17.159491352 +0000 UTC m=+84.632592433" observedRunningTime="2026-04-20 11:43:17.452759692 +0000 UTC m=+84.925860780" watchObservedRunningTime="2026-04-20 11:43:17.455833736 +0000 UTC m=+84.928934815"
Apr 20 11:43:17.483203 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:17.483155 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-f6qb2" podStartSLOduration=0.896692467 podStartE2EDuration="4.483139847s" podCreationTimestamp="2026-04-20 11:43:13 +0000 UTC" firstStartedPulling="2026-04-20 11:43:13.564451915 +0000 UTC m=+81.037552979" lastFinishedPulling="2026-04-20 11:43:17.150899279 +0000 UTC m=+84.624000359" observedRunningTime="2026-04-20 11:43:17.482518435 +0000 UTC m=+84.955619522" watchObservedRunningTime="2026-04-20 11:43:17.483139847 +0000 UTC m=+84.956240936"
Apr 20 11:43:18.438663 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:18.438623 2585 generic.go:358] "Generic (PLEG): container finished" podID="4969e65e-b617-4944-affd-e915820a2349" containerID="152bb1cfe1d6c912b1cbf6f1de945398c5a957d4070d93ed43266d4caaf47679" exitCode=0
Apr 20 11:43:18.439136 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:18.438666 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ltnvp" event={"ID":"4969e65e-b617-4944-affd-e915820a2349","Type":"ContainerDied","Data":"152bb1cfe1d6c912b1cbf6f1de945398c5a957d4070d93ed43266d4caaf47679"}
Apr 20 11:43:19.057612 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.057582 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"]
Apr 20 11:43:19.060757 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.060731 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.066666 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.066642 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 20 11:43:19.068115 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.068093 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 20 11:43:19.068246 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.068193 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-a6250p1j28b04\""
Apr 20 11:43:19.068246 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.068097 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 20 11:43:19.068383 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.068295 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 20 11:43:19.068383 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.068321 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-wssjl\""
Apr 20 11:43:19.071722 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.071683 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"]
Apr 20 11:43:19.137255 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.137221 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/1fc2020e-e650-43bd-b853-69fd2c470198-secret-metrics-server-client-certs\") pod \"metrics-server-6b58ddffd5-kfq6v\" (UID: \"1fc2020e-e650-43bd-b853-69fd2c470198\") " pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.137395 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.137268 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1fc2020e-e650-43bd-b853-69fd2c470198-audit-log\") pod \"metrics-server-6b58ddffd5-kfq6v\" (UID: \"1fc2020e-e650-43bd-b853-69fd2c470198\") " pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.137395 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.137294 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t6hz\" (UniqueName: \"kubernetes.io/projected/1fc2020e-e650-43bd-b853-69fd2c470198-kube-api-access-6t6hz\") pod \"metrics-server-6b58ddffd5-kfq6v\" (UID: \"1fc2020e-e650-43bd-b853-69fd2c470198\") " pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.137395 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.137334 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1fc2020e-e650-43bd-b853-69fd2c470198-metrics-server-audit-profiles\") pod \"metrics-server-6b58ddffd5-kfq6v\" (UID: \"1fc2020e-e650-43bd-b853-69fd2c470198\") " pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.137521 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.137409 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1fc2020e-e650-43bd-b853-69fd2c470198-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6b58ddffd5-kfq6v\" (UID: \"1fc2020e-e650-43bd-b853-69fd2c470198\") " pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.137521 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.137444 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1fc2020e-e650-43bd-b853-69fd2c470198-secret-metrics-server-tls\") pod \"metrics-server-6b58ddffd5-kfq6v\" (UID: \"1fc2020e-e650-43bd-b853-69fd2c470198\") " pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.137521 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.137491 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc2020e-e650-43bd-b853-69fd2c470198-client-ca-bundle\") pod \"metrics-server-6b58ddffd5-kfq6v\" (UID: \"1fc2020e-e650-43bd-b853-69fd2c470198\") " pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.238104 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.238069 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/1fc2020e-e650-43bd-b853-69fd2c470198-secret-metrics-server-client-certs\") pod \"metrics-server-6b58ddffd5-kfq6v\" (UID: \"1fc2020e-e650-43bd-b853-69fd2c470198\") " pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.238244 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.238118 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1fc2020e-e650-43bd-b853-69fd2c470198-audit-log\") pod \"metrics-server-6b58ddffd5-kfq6v\" (UID: \"1fc2020e-e650-43bd-b853-69fd2c470198\") " pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.238244 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.238136 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6t6hz\" (UniqueName: \"kubernetes.io/projected/1fc2020e-e650-43bd-b853-69fd2c470198-kube-api-access-6t6hz\") pod \"metrics-server-6b58ddffd5-kfq6v\" (UID: \"1fc2020e-e650-43bd-b853-69fd2c470198\") " pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.238244 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.238161 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1fc2020e-e650-43bd-b853-69fd2c470198-metrics-server-audit-profiles\") pod \"metrics-server-6b58ddffd5-kfq6v\" (UID: \"1fc2020e-e650-43bd-b853-69fd2c470198\") " pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.238244 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.238182 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1fc2020e-e650-43bd-b853-69fd2c470198-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6b58ddffd5-kfq6v\" (UID: \"1fc2020e-e650-43bd-b853-69fd2c470198\") " pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.238244 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.238200 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1fc2020e-e650-43bd-b853-69fd2c470198-secret-metrics-server-tls\") pod \"metrics-server-6b58ddffd5-kfq6v\" (UID: \"1fc2020e-e650-43bd-b853-69fd2c470198\") " pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.238244 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.238219 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc2020e-e650-43bd-b853-69fd2c470198-client-ca-bundle\") pod \"metrics-server-6b58ddffd5-kfq6v\" (UID: \"1fc2020e-e650-43bd-b853-69fd2c470198\") " pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.238627 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.238600 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1fc2020e-e650-43bd-b853-69fd2c470198-audit-log\") pod \"metrics-server-6b58ddffd5-kfq6v\" (UID: \"1fc2020e-e650-43bd-b853-69fd2c470198\") " pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.239324 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.239023 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1fc2020e-e650-43bd-b853-69fd2c470198-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6b58ddffd5-kfq6v\" (UID: \"1fc2020e-e650-43bd-b853-69fd2c470198\") " pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.239324 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.239232 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1fc2020e-e650-43bd-b853-69fd2c470198-metrics-server-audit-profiles\") pod \"metrics-server-6b58ddffd5-kfq6v\" (UID: \"1fc2020e-e650-43bd-b853-69fd2c470198\") " pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.241393 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.241367 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/1fc2020e-e650-43bd-b853-69fd2c470198-secret-metrics-server-client-certs\") pod \"metrics-server-6b58ddffd5-kfq6v\" (UID: \"1fc2020e-e650-43bd-b853-69fd2c470198\") " pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.241484 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.241423 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1fc2020e-e650-43bd-b853-69fd2c470198-secret-metrics-server-tls\") pod \"metrics-server-6b58ddffd5-kfq6v\" (UID: \"1fc2020e-e650-43bd-b853-69fd2c470198\") " pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.241854 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.241834 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc2020e-e650-43bd-b853-69fd2c470198-client-ca-bundle\") pod \"metrics-server-6b58ddffd5-kfq6v\" (UID: \"1fc2020e-e650-43bd-b853-69fd2c470198\") " pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.248347 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.248326 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t6hz\" (UniqueName: \"kubernetes.io/projected/1fc2020e-e650-43bd-b853-69fd2c470198-kube-api-access-6t6hz\") pod \"metrics-server-6b58ddffd5-kfq6v\" (UID: \"1fc2020e-e650-43bd-b853-69fd2c470198\") " pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.308428 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.308337 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-qncmn"]
Apr 20 11:43:19.312629 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.312606 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qncmn"
Apr 20 11:43:19.315993 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.315973 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 20 11:43:19.316132 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.316049 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-7bs2w\""
Apr 20 11:43:19.320821 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.320794 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-qncmn"]
Apr 20 11:43:19.369330 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.369291 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"
Apr 20 11:43:19.443120 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.443081 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/04d7888a-8e02-4e73-af80-956c151f0ff8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-qncmn\" (UID: \"04d7888a-8e02-4e73-af80-956c151f0ff8\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qncmn"
Apr 20 11:43:19.454644 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.454611 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ltnvp" event={"ID":"4969e65e-b617-4944-affd-e915820a2349","Type":"ContainerStarted","Data":"6c9aafa3dcdfee5e4d96431c49aed075f3c476989dc953abb4ad16c52ff8fd52"}
Apr 20 11:43:19.454792 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.454655 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ltnvp" event={"ID":"4969e65e-b617-4944-affd-e915820a2349","Type":"ContainerStarted","Data":"047e503acfa56b0d6995517af8e0e12a5112fdca8ade68b53126483fa745465f"}
Apr 20 11:43:19.456966 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.456939 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn" event={"ID":"d729fe95-de51-488a-8d09-dcff02e454d4","Type":"ContainerStarted","Data":"d3374e0c85cf75ee8a8ba2e91e5dab438b00a36cced532f132412b51039456e8"}
Apr 20 11:43:19.479121 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.479063 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ltnvp" podStartSLOduration=4.711264128 podStartE2EDuration="5.479045963s" podCreationTimestamp="2026-04-20 11:43:14 +0000 UTC" firstStartedPulling="2026-04-20 11:43:17.154712414 +0000 UTC m=+84.627813478" lastFinishedPulling="2026-04-20 11:43:17.92249424 +0000 UTC m=+85.395595313" observedRunningTime="2026-04-20 11:43:19.477379109 +0000 UTC m=+86.950480198" watchObservedRunningTime="2026-04-20 11:43:19.479045963 +0000 UTC m=+86.952147049"
Apr 20 11:43:19.498338 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.498277 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2mmtn" podStartSLOduration=4.247457169 podStartE2EDuration="5.498264898s" podCreationTimestamp="2026-04-20 11:43:14 +0000 UTC" firstStartedPulling="2026-04-20 11:43:17.282592952 +0000 UTC m=+84.755694020" lastFinishedPulling="2026-04-20 11:43:18.533400672 +0000 UTC m=+86.006501749" observedRunningTime="2026-04-20 11:43:19.497169608 +0000 UTC m=+86.970270695" watchObservedRunningTime="2026-04-20 11:43:19.498264898 +0000 UTC m=+86.971366022"
Apr 20 11:43:19.505583 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.505545 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6b58ddffd5-kfq6v"]
Apr 20 11:43:19.509274 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:43:19.509252 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fc2020e_e650_43bd_b853_69fd2c470198.slice/crio-6f4b2899c3ee55a591f5215dfdaaa0f6930bfb6a36beb9ede129a336f3422954 WatchSource:0}: Error finding container 6f4b2899c3ee55a591f5215dfdaaa0f6930bfb6a36beb9ede129a336f3422954: Status 404 returned error can't find the container with id 6f4b2899c3ee55a591f5215dfdaaa0f6930bfb6a36beb9ede129a336f3422954
Apr 20 11:43:19.544031 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.544004 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/04d7888a-8e02-4e73-af80-956c151f0ff8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-qncmn\" (UID: \"04d7888a-8e02-4e73-af80-956c151f0ff8\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qncmn"
Apr 20 11:43:19.544397 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:43:19.544198 2585 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 20 11:43:19.544397 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:43:19.544280 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04d7888a-8e02-4e73-af80-956c151f0ff8-monitoring-plugin-cert podName:04d7888a-8e02-4e73-af80-956c151f0ff8 nodeName:}" failed. No retries permitted until 2026-04-20 11:43:20.04425818 +0000 UTC m=+87.517359251 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/04d7888a-8e02-4e73-af80-956c151f0ff8-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-qncmn" (UID: "04d7888a-8e02-4e73-af80-956c151f0ff8") : secret "monitoring-plugin-cert" not found
Apr 20 11:43:19.752007 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.751917 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-857cf956d6-dw8b7"]
Apr 20 11:43:19.756815 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.756793 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7"
Apr 20 11:43:19.760946 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.760923 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 20 11:43:19.761297 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.761268 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 20 11:43:19.761635 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.761513 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 20 11:43:19.761635 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.761523 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 20 11:43:19.761635 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.761578 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-k7r2t\""
Apr 20 11:43:19.761876 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.761831 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 20 11:43:19.766484 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.766460 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 20 11:43:19.774571 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.774544 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-857cf956d6-dw8b7"]
Apr 20 11:43:19.847572 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.847534 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca478535-97ee-441f-b4d4-efd62183ae21-serving-certs-ca-bundle\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7"
Apr 20 11:43:19.847726 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.847589 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca478535-97ee-441f-b4d4-efd62183ae21-metrics-client-ca\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7"
Apr 20 11:43:19.847726 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.847672 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ca478535-97ee-441f-b4d4-efd62183ae21-federate-client-tls\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7"
Apr 20 11:43:19.847824 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.847772 2585 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ca478535-97ee-441f-b4d4-efd62183ae21-telemeter-client-tls\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:19.847824 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.847793 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5gmw\" (UniqueName: \"kubernetes.io/projected/ca478535-97ee-441f-b4d4-efd62183ae21-kube-api-access-n5gmw\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:19.847924 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.847820 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca478535-97ee-441f-b4d4-efd62183ae21-telemeter-trusted-ca-bundle\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:19.847924 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.847894 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ca478535-97ee-441f-b4d4-efd62183ae21-secret-telemeter-client\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:19.848005 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.847960 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/ca478535-97ee-441f-b4d4-efd62183ae21-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:19.949301 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.949268 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ca478535-97ee-441f-b4d4-efd62183ae21-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:19.949478 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.949306 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca478535-97ee-441f-b4d4-efd62183ae21-serving-certs-ca-bundle\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:19.949478 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.949332 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca478535-97ee-441f-b4d4-efd62183ae21-metrics-client-ca\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:19.949478 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.949358 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ca478535-97ee-441f-b4d4-efd62183ae21-federate-client-tls\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: 
\"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:19.949478 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.949399 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ca478535-97ee-441f-b4d4-efd62183ae21-telemeter-client-tls\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:19.949478 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.949415 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5gmw\" (UniqueName: \"kubernetes.io/projected/ca478535-97ee-441f-b4d4-efd62183ae21-kube-api-access-n5gmw\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:19.949478 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.949435 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca478535-97ee-441f-b4d4-efd62183ae21-telemeter-trusted-ca-bundle\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:19.949478 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.949469 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ca478535-97ee-441f-b4d4-efd62183ae21-secret-telemeter-client\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:19.950273 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.950192 2585 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca478535-97ee-441f-b4d4-efd62183ae21-metrics-client-ca\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:19.950473 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.950350 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca478535-97ee-441f-b4d4-efd62183ae21-serving-certs-ca-bundle\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:19.950550 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.950481 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca478535-97ee-441f-b4d4-efd62183ae21-telemeter-trusted-ca-bundle\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:19.952016 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.951991 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ca478535-97ee-441f-b4d4-efd62183ae21-federate-client-tls\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:19.952111 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.952092 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ca478535-97ee-441f-b4d4-efd62183ae21-telemeter-client-tls\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " 
pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:19.952320 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.952293 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ca478535-97ee-441f-b4d4-efd62183ae21-secret-telemeter-client\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:19.952412 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.952342 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ca478535-97ee-441f-b4d4-efd62183ae21-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:19.964026 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.964000 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5gmw\" (UniqueName: \"kubernetes.io/projected/ca478535-97ee-441f-b4d4-efd62183ae21-kube-api-access-n5gmw\") pod \"telemeter-client-857cf956d6-dw8b7\" (UID: \"ca478535-97ee-441f-b4d4-efd62183ae21\") " pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:19.987347 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.987319 2585 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"] Apr 20 11:43:19.991188 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:19.991170 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl" Apr 20 11:43:20.050426 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:20.050397 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/04d7888a-8e02-4e73-af80-956c151f0ff8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-qncmn\" (UID: \"04d7888a-8e02-4e73-af80-956c151f0ff8\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qncmn" Apr 20 11:43:20.053012 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:20.052990 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/04d7888a-8e02-4e73-af80-956c151f0ff8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-qncmn\" (UID: \"04d7888a-8e02-4e73-af80-956c151f0ff8\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qncmn" Apr 20 11:43:20.067426 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:20.067405 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" Apr 20 11:43:20.223717 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:20.223666 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qncmn" Apr 20 11:43:20.263480 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:20.262790 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-857cf956d6-dw8b7"] Apr 20 11:43:20.266606 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:43:20.266572 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca478535_97ee_441f_b4d4_efd62183ae21.slice/crio-9550bc751ec0b68d17a3e0f647080d88596c263658236f7891cb97e5ecc94076 WatchSource:0}: Error finding container 9550bc751ec0b68d17a3e0f647080d88596c263658236f7891cb97e5ecc94076: Status 404 returned error can't find the container with id 9550bc751ec0b68d17a3e0f647080d88596c263658236f7891cb97e5ecc94076 Apr 20 11:43:20.377742 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:20.377719 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-qncmn"] Apr 20 11:43:20.380414 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:43:20.380388 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04d7888a_8e02_4e73_af80_956c151f0ff8.slice/crio-ce064bba88cce5715c6ad36a63aeaeb4284582e5dfc9aabec6b35bf5414d3707 WatchSource:0}: Error finding container ce064bba88cce5715c6ad36a63aeaeb4284582e5dfc9aabec6b35bf5414d3707: Status 404 returned error can't find the container with id ce064bba88cce5715c6ad36a63aeaeb4284582e5dfc9aabec6b35bf5414d3707 Apr 20 11:43:20.462063 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:20.462027 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qncmn" event={"ID":"04d7888a-8e02-4e73-af80-956c151f0ff8","Type":"ContainerStarted","Data":"ce064bba88cce5715c6ad36a63aeaeb4284582e5dfc9aabec6b35bf5414d3707"} Apr 20 11:43:20.463302 ip-10-0-141-26 
kubenswrapper[2585]: I0420 11:43:20.463270 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v" event={"ID":"1fc2020e-e650-43bd-b853-69fd2c470198","Type":"ContainerStarted","Data":"6f4b2899c3ee55a591f5215dfdaaa0f6930bfb6a36beb9ede129a336f3422954"} Apr 20 11:43:20.464535 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:20.464510 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" event={"ID":"ca478535-97ee-441f-b4d4-efd62183ae21","Type":"ContainerStarted","Data":"9550bc751ec0b68d17a3e0f647080d88596c263658236f7891cb97e5ecc94076"} Apr 20 11:43:21.470404 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:21.470364 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v" event={"ID":"1fc2020e-e650-43bd-b853-69fd2c470198","Type":"ContainerStarted","Data":"8aa1c5069afe7bfb48409ac47096f4fe134de774e863d53232b6ac9517cbf44d"} Apr 20 11:43:21.491673 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:21.491597 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v" podStartSLOduration=1.051004588 podStartE2EDuration="2.491575936s" podCreationTimestamp="2026-04-20 11:43:19 +0000 UTC" firstStartedPulling="2026-04-20 11:43:19.511677401 +0000 UTC m=+86.984778469" lastFinishedPulling="2026-04-20 11:43:20.952248738 +0000 UTC m=+88.425349817" observedRunningTime="2026-04-20 11:43:21.488918053 +0000 UTC m=+88.962019139" watchObservedRunningTime="2026-04-20 11:43:21.491575936 +0000 UTC m=+88.964677022" Apr 20 11:43:22.478564 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:22.478524 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" 
event={"ID":"ca478535-97ee-441f-b4d4-efd62183ae21","Type":"ContainerStarted","Data":"74938af80b97f9a6d6866d442cabdbb94b82d41717c33802244b9b31e8d9cdf1"} Apr 20 11:43:22.480114 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:22.480085 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qncmn" event={"ID":"04d7888a-8e02-4e73-af80-956c151f0ff8","Type":"ContainerStarted","Data":"6c7a7b218977545f2260a7d1799595e495fef2cd3be039a723e5ed1adba52ca2"} Apr 20 11:43:22.480216 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:22.480123 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qncmn" Apr 20 11:43:22.484824 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:22.484795 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qncmn" Apr 20 11:43:22.500074 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:22.500032 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qncmn" podStartSLOduration=1.489384796 podStartE2EDuration="3.500020253s" podCreationTimestamp="2026-04-20 11:43:19 +0000 UTC" firstStartedPulling="2026-04-20 11:43:20.382624888 +0000 UTC m=+87.855725956" lastFinishedPulling="2026-04-20 11:43:22.393260335 +0000 UTC m=+89.866361413" observedRunningTime="2026-04-20 11:43:22.499119666 +0000 UTC m=+89.972220754" watchObservedRunningTime="2026-04-20 11:43:22.500020253 +0000 UTC m=+89.973121349" Apr 20 11:43:23.486199 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:23.486157 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" event={"ID":"ca478535-97ee-441f-b4d4-efd62183ae21","Type":"ContainerStarted","Data":"a796083a730658eacd8e0d62f2934d1bf6750beb51b752004a3a51c96896024f"} Apr 20 11:43:23.486548 ip-10-0-141-26 
kubenswrapper[2585]: I0420 11:43:23.486213 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" event={"ID":"ca478535-97ee-441f-b4d4-efd62183ae21","Type":"ContainerStarted","Data":"287a443f541a93d7e0fc676a31a87b811cc35d5266d20130328b53d3317f2b88"} Apr 20 11:43:23.521326 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:23.521279 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-857cf956d6-dw8b7" podStartSLOduration=1.464727133 podStartE2EDuration="4.521265384s" podCreationTimestamp="2026-04-20 11:43:19 +0000 UTC" firstStartedPulling="2026-04-20 11:43:20.269422842 +0000 UTC m=+87.742523913" lastFinishedPulling="2026-04-20 11:43:23.3259611 +0000 UTC m=+90.799062164" observedRunningTime="2026-04-20 11:43:23.516088863 +0000 UTC m=+90.989189945" watchObservedRunningTime="2026-04-20 11:43:23.521265384 +0000 UTC m=+90.994366469" Apr 20 11:43:23.744270 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:23.744194 2585 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:23.744270 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:23.744237 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:23.749079 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:23.749058 2585 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:24.492682 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:24.492654 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:43:26.530979 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.530950 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7f996b6b67-hmd7w"] Apr 20 11:43:26.534391 
ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.534373 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.544320 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.544295 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 20 11:43:26.545782 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.545757 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f996b6b67-hmd7w"] Apr 20 11:43:26.713645 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.713611 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-console-config\") pod \"console-7f996b6b67-hmd7w\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") " pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.713645 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.713650 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpvmn\" (UniqueName: \"kubernetes.io/projected/297c05f3-c78f-4108-9852-e9ea2401a235-kube-api-access-vpvmn\") pod \"console-7f996b6b67-hmd7w\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") " pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.713905 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.713672 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-trusted-ca-bundle\") pod \"console-7f996b6b67-hmd7w\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") " pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.713905 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.713841 2585 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-service-ca\") pod \"console-7f996b6b67-hmd7w\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") " pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.713905 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.713883 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/297c05f3-c78f-4108-9852-e9ea2401a235-console-oauth-config\") pod \"console-7f996b6b67-hmd7w\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") " pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.714036 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.713914 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-oauth-serving-cert\") pod \"console-7f996b6b67-hmd7w\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") " pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.714036 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.713949 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/297c05f3-c78f-4108-9852-e9ea2401a235-console-serving-cert\") pod \"console-7f996b6b67-hmd7w\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") " pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.815132 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.815049 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-service-ca\") pod \"console-7f996b6b67-hmd7w\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") " 
pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.815132 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.815084 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/297c05f3-c78f-4108-9852-e9ea2401a235-console-oauth-config\") pod \"console-7f996b6b67-hmd7w\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") " pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.815132 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.815102 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-oauth-serving-cert\") pod \"console-7f996b6b67-hmd7w\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") " pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.815132 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.815121 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/297c05f3-c78f-4108-9852-e9ea2401a235-console-serving-cert\") pod \"console-7f996b6b67-hmd7w\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") " pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.815420 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.815141 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-console-config\") pod \"console-7f996b6b67-hmd7w\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") " pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.815420 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.815164 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpvmn\" (UniqueName: \"kubernetes.io/projected/297c05f3-c78f-4108-9852-e9ea2401a235-kube-api-access-vpvmn\") pod 
\"console-7f996b6b67-hmd7w\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") " pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.815420 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.815184 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-trusted-ca-bundle\") pod \"console-7f996b6b67-hmd7w\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") " pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.815902 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.815872 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-console-config\") pod \"console-7f996b6b67-hmd7w\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") " pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.816032 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.815904 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-oauth-serving-cert\") pod \"console-7f996b6b67-hmd7w\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") " pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.816032 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.815951 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-service-ca\") pod \"console-7f996b6b67-hmd7w\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") " pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.816109 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.816001 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-trusted-ca-bundle\") pod \"console-7f996b6b67-hmd7w\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") " pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.817928 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.817909 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/297c05f3-c78f-4108-9852-e9ea2401a235-console-oauth-config\") pod \"console-7f996b6b67-hmd7w\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") " pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.817928 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.817921 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/297c05f3-c78f-4108-9852-e9ea2401a235-console-serving-cert\") pod \"console-7f996b6b67-hmd7w\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") " pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.827284 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.827261 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpvmn\" (UniqueName: \"kubernetes.io/projected/297c05f3-c78f-4108-9852-e9ea2401a235-kube-api-access-vpvmn\") pod \"console-7f996b6b67-hmd7w\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") " pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.844172 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.844143 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:26.970755 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:26.970727 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f996b6b67-hmd7w"] Apr 20 11:43:26.973816 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:43:26.973791 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod297c05f3_c78f_4108_9852_e9ea2401a235.slice/crio-f45f010b36598de962b2d336abc0baae6e2765b9d9ccbf297d47053609cbcda4 WatchSource:0}: Error finding container f45f010b36598de962b2d336abc0baae6e2765b9d9ccbf297d47053609cbcda4: Status 404 returned error can't find the container with id f45f010b36598de962b2d336abc0baae6e2765b9d9ccbf297d47053609cbcda4 Apr 20 11:43:27.498644 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:27.498603 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f996b6b67-hmd7w" event={"ID":"297c05f3-c78f-4108-9852-e9ea2401a235","Type":"ContainerStarted","Data":"588f79af80dbe6e1711304406dffdd3f1ab77aaa9339b4c02f3bd61291274fd0"} Apr 20 11:43:27.498644 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:27.498648 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f996b6b67-hmd7w" event={"ID":"297c05f3-c78f-4108-9852-e9ea2401a235","Type":"ContainerStarted","Data":"f45f010b36598de962b2d336abc0baae6e2765b9d9ccbf297d47053609cbcda4"} Apr 20 11:43:27.522233 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:27.522188 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7f996b6b67-hmd7w" podStartSLOduration=1.522176377 podStartE2EDuration="1.522176377s" podCreationTimestamp="2026-04-20 11:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 11:43:27.520820603 +0000 UTC m=+94.993921691" 
watchObservedRunningTime="2026-04-20 11:43:27.522176377 +0000 UTC m=+94.995277463" Apr 20 11:43:36.844796 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:36.844755 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:36.844796 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:36.844807 2585 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:36.849405 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:36.849381 2585 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:37.537432 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:37.537400 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7f996b6b67-hmd7w" Apr 20 11:43:37.586628 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:37.586592 2585 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7797d4c54d-kmn9n"] Apr 20 11:43:39.370300 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:39.370263 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v" Apr 20 11:43:39.370675 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:39.370311 2585 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v" Apr 20 11:43:45.006310 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.006243 2585 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl" podUID="3b617c5e-ef25-4c2f-b279-519030aa35e0" containerName="registry" containerID="cri-o://e9972533e256883ea780eec9b6570ee0c1201d0c8a6d85af6bbbe937a8f632ce" gracePeriod=30 Apr 20 11:43:45.247552 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.247528 
2585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl" Apr 20 11:43:45.367746 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.367713 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3b617c5e-ef25-4c2f-b279-519030aa35e0-image-registry-private-configuration\") pod \"3b617c5e-ef25-4c2f-b279-519030aa35e0\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " Apr 20 11:43:45.367930 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.367766 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-bound-sa-token\") pod \"3b617c5e-ef25-4c2f-b279-519030aa35e0\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " Apr 20 11:43:45.367930 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.367785 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpkbw\" (UniqueName: \"kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-kube-api-access-xpkbw\") pod \"3b617c5e-ef25-4c2f-b279-519030aa35e0\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " Apr 20 11:43:45.367930 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.367803 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls\") pod \"3b617c5e-ef25-4c2f-b279-519030aa35e0\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " Apr 20 11:43:45.367930 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.367824 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b617c5e-ef25-4c2f-b279-519030aa35e0-ca-trust-extracted\") pod 
\"3b617c5e-ef25-4c2f-b279-519030aa35e0\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " Apr 20 11:43:45.367930 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.367869 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-certificates\") pod \"3b617c5e-ef25-4c2f-b279-519030aa35e0\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " Apr 20 11:43:45.367930 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.367904 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b617c5e-ef25-4c2f-b279-519030aa35e0-installation-pull-secrets\") pod \"3b617c5e-ef25-4c2f-b279-519030aa35e0\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " Apr 20 11:43:45.367930 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.367926 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b617c5e-ef25-4c2f-b279-519030aa35e0-trusted-ca\") pod \"3b617c5e-ef25-4c2f-b279-519030aa35e0\" (UID: \"3b617c5e-ef25-4c2f-b279-519030aa35e0\") " Apr 20 11:43:45.368372 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.368338 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "3b617c5e-ef25-4c2f-b279-519030aa35e0" (UID: "3b617c5e-ef25-4c2f-b279-519030aa35e0"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 11:43:45.368627 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.368399 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b617c5e-ef25-4c2f-b279-519030aa35e0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "3b617c5e-ef25-4c2f-b279-519030aa35e0" (UID: "3b617c5e-ef25-4c2f-b279-519030aa35e0"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 11:43:45.370397 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.370343 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b617c5e-ef25-4c2f-b279-519030aa35e0-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "3b617c5e-ef25-4c2f-b279-519030aa35e0" (UID: "3b617c5e-ef25-4c2f-b279-519030aa35e0"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 11:43:45.370518 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.370465 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b617c5e-ef25-4c2f-b279-519030aa35e0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "3b617c5e-ef25-4c2f-b279-519030aa35e0" (UID: "3b617c5e-ef25-4c2f-b279-519030aa35e0"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 11:43:45.370801 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.370773 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "3b617c5e-ef25-4c2f-b279-519030aa35e0" (UID: "3b617c5e-ef25-4c2f-b279-519030aa35e0"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 11:43:45.370801 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.370784 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "3b617c5e-ef25-4c2f-b279-519030aa35e0" (UID: "3b617c5e-ef25-4c2f-b279-519030aa35e0"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 11:43:45.370944 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.370887 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-kube-api-access-xpkbw" (OuterVolumeSpecName: "kube-api-access-xpkbw") pod "3b617c5e-ef25-4c2f-b279-519030aa35e0" (UID: "3b617c5e-ef25-4c2f-b279-519030aa35e0"). InnerVolumeSpecName "kube-api-access-xpkbw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 11:43:45.376874 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.376845 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b617c5e-ef25-4c2f-b279-519030aa35e0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "3b617c5e-ef25-4c2f-b279-519030aa35e0" (UID: "3b617c5e-ef25-4c2f-b279-519030aa35e0"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 11:43:45.468949 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.468917 2585 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3b617c5e-ef25-4c2f-b279-519030aa35e0-image-registry-private-configuration\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\"" Apr 20 11:43:45.468949 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.468943 2585 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-bound-sa-token\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\"" Apr 20 11:43:45.468949 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.468953 2585 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xpkbw\" (UniqueName: \"kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-kube-api-access-xpkbw\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\"" Apr 20 11:43:45.469155 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.468962 2585 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-tls\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\"" Apr 20 11:43:45.469155 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.468971 2585 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b617c5e-ef25-4c2f-b279-519030aa35e0-ca-trust-extracted\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\"" Apr 20 11:43:45.469155 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.468980 2585 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b617c5e-ef25-4c2f-b279-519030aa35e0-registry-certificates\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\"" Apr 20 
11:43:45.469155 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.468989 2585 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b617c5e-ef25-4c2f-b279-519030aa35e0-installation-pull-secrets\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\"" Apr 20 11:43:45.469155 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.468997 2585 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b617c5e-ef25-4c2f-b279-519030aa35e0-trusted-ca\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\"" Apr 20 11:43:45.557398 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.557361 2585 generic.go:358] "Generic (PLEG): container finished" podID="3b617c5e-ef25-4c2f-b279-519030aa35e0" containerID="e9972533e256883ea780eec9b6570ee0c1201d0c8a6d85af6bbbe937a8f632ce" exitCode=0 Apr 20 11:43:45.557531 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.557424 2585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl" Apr 20 11:43:45.557531 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.557462 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl" event={"ID":"3b617c5e-ef25-4c2f-b279-519030aa35e0","Type":"ContainerDied","Data":"e9972533e256883ea780eec9b6570ee0c1201d0c8a6d85af6bbbe937a8f632ce"} Apr 20 11:43:45.557531 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.557499 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl" event={"ID":"3b617c5e-ef25-4c2f-b279-519030aa35e0","Type":"ContainerDied","Data":"d5a61cd1fad0f2caa12e7f0aec82ad3870f9772d7d757903e6ed2e4148adf665"} Apr 20 11:43:45.557531 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.557513 2585 scope.go:117] "RemoveContainer" containerID="e9972533e256883ea780eec9b6570ee0c1201d0c8a6d85af6bbbe937a8f632ce" Apr 20 11:43:45.566447 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.566426 2585 scope.go:117] "RemoveContainer" containerID="e9972533e256883ea780eec9b6570ee0c1201d0c8a6d85af6bbbe937a8f632ce" Apr 20 11:43:45.566684 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:43:45.566663 2585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9972533e256883ea780eec9b6570ee0c1201d0c8a6d85af6bbbe937a8f632ce\": container with ID starting with e9972533e256883ea780eec9b6570ee0c1201d0c8a6d85af6bbbe937a8f632ce not found: ID does not exist" containerID="e9972533e256883ea780eec9b6570ee0c1201d0c8a6d85af6bbbe937a8f632ce" Apr 20 11:43:45.566809 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.566756 2585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9972533e256883ea780eec9b6570ee0c1201d0c8a6d85af6bbbe937a8f632ce"} err="failed to get container status 
\"e9972533e256883ea780eec9b6570ee0c1201d0c8a6d85af6bbbe937a8f632ce\": rpc error: code = NotFound desc = could not find container \"e9972533e256883ea780eec9b6570ee0c1201d0c8a6d85af6bbbe937a8f632ce\": container with ID starting with e9972533e256883ea780eec9b6570ee0c1201d0c8a6d85af6bbbe937a8f632ce not found: ID does not exist" Apr 20 11:43:45.578979 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.578954 2585 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"] Apr 20 11:43:45.582894 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:45.582873 2585 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7bdbfd9bcc-nhjwl"] Apr 20 11:43:47.054770 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:47.054734 2585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b617c5e-ef25-4c2f-b279-519030aa35e0" path="/var/lib/kubelet/pods/3b617c5e-ef25-4c2f-b279-519030aa35e0/volumes" Apr 20 11:43:59.374345 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:59.374312 2585 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v" Apr 20 11:43:59.378139 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:43:59.378115 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6b58ddffd5-kfq6v" Apr 20 11:44:02.606507 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:02.606463 2585 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7797d4c54d-kmn9n" podUID="f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7" containerName="console" containerID="cri-o://5ecaa3673fb90a33d4a01560bdf6ca6cbbb091528e881e20b8b1dfa7b311f45b" gracePeriod=15 Apr 20 11:44:02.855609 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:02.855590 2585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-7797d4c54d-kmn9n_f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7/console/0.log" Apr 20 11:44:02.855739 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:02.855652 2585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:44:03.017400 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.017299 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-console-oauth-config\") pod \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " Apr 20 11:44:03.017400 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.017351 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-console-config\") pod \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " Apr 20 11:44:03.017400 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.017382 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-oauth-serving-cert\") pod \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " Apr 20 11:44:03.017679 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.017406 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcnjs\" (UniqueName: \"kubernetes.io/projected/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-kube-api-access-xcnjs\") pod \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " Apr 20 11:44:03.017679 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.017430 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-console-serving-cert\") pod \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " Apr 20 11:44:03.017679 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.017450 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-service-ca\") pod \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\" (UID: \"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7\") " Apr 20 11:44:03.017902 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.017876 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7" (UID: "f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 11:44:03.018005 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.017978 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-service-ca" (OuterVolumeSpecName: "service-ca") pod "f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7" (UID: "f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 11:44:03.018063 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.018045 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-console-config" (OuterVolumeSpecName: "console-config") pod "f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7" (UID: "f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 11:44:03.020214 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.020187 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7" (UID: "f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 11:44:03.020307 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.020235 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-kube-api-access-xcnjs" (OuterVolumeSpecName: "kube-api-access-xcnjs") pod "f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7" (UID: "f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7"). InnerVolumeSpecName "kube-api-access-xcnjs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 11:44:03.020307 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.020261 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7" (UID: "f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 11:44:03.118931 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.118905 2585 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-console-oauth-config\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\"" Apr 20 11:44:03.118931 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.118928 2585 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-console-config\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\"" Apr 20 11:44:03.119075 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.118938 2585 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-oauth-serving-cert\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\"" Apr 20 11:44:03.119075 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.118947 2585 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xcnjs\" (UniqueName: \"kubernetes.io/projected/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-kube-api-access-xcnjs\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\"" Apr 20 11:44:03.119075 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.118956 2585 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-console-serving-cert\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\"" Apr 20 11:44:03.119075 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.118964 2585 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7-service-ca\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\"" Apr 20 11:44:03.609813 ip-10-0-141-26 
kubenswrapper[2585]: I0420 11:44:03.609782 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7797d4c54d-kmn9n_f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7/console/0.log" Apr 20 11:44:03.610264 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.609829 2585 generic.go:358] "Generic (PLEG): container finished" podID="f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7" containerID="5ecaa3673fb90a33d4a01560bdf6ca6cbbb091528e881e20b8b1dfa7b311f45b" exitCode=2 Apr 20 11:44:03.610264 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.609889 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7797d4c54d-kmn9n" event={"ID":"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7","Type":"ContainerDied","Data":"5ecaa3673fb90a33d4a01560bdf6ca6cbbb091528e881e20b8b1dfa7b311f45b"} Apr 20 11:44:03.610264 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.609923 2585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7797d4c54d-kmn9n" Apr 20 11:44:03.610264 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.609941 2585 scope.go:117] "RemoveContainer" containerID="5ecaa3673fb90a33d4a01560bdf6ca6cbbb091528e881e20b8b1dfa7b311f45b" Apr 20 11:44:03.610264 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.609927 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7797d4c54d-kmn9n" event={"ID":"f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7","Type":"ContainerDied","Data":"6e49e601439a930ae180f6a403dc16189d7ead2de1b706b54b405d353f3ae9ae"} Apr 20 11:44:03.617896 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.617875 2585 scope.go:117] "RemoveContainer" containerID="5ecaa3673fb90a33d4a01560bdf6ca6cbbb091528e881e20b8b1dfa7b311f45b" Apr 20 11:44:03.618152 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:44:03.618135 2585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5ecaa3673fb90a33d4a01560bdf6ca6cbbb091528e881e20b8b1dfa7b311f45b\": container with ID starting with 5ecaa3673fb90a33d4a01560bdf6ca6cbbb091528e881e20b8b1dfa7b311f45b not found: ID does not exist" containerID="5ecaa3673fb90a33d4a01560bdf6ca6cbbb091528e881e20b8b1dfa7b311f45b" Apr 20 11:44:03.618195 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.618160 2585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ecaa3673fb90a33d4a01560bdf6ca6cbbb091528e881e20b8b1dfa7b311f45b"} err="failed to get container status \"5ecaa3673fb90a33d4a01560bdf6ca6cbbb091528e881e20b8b1dfa7b311f45b\": rpc error: code = NotFound desc = could not find container \"5ecaa3673fb90a33d4a01560bdf6ca6cbbb091528e881e20b8b1dfa7b311f45b\": container with ID starting with 5ecaa3673fb90a33d4a01560bdf6ca6cbbb091528e881e20b8b1dfa7b311f45b not found: ID does not exist" Apr 20 11:44:03.628128 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.628106 2585 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7797d4c54d-kmn9n"] Apr 20 11:44:03.631333 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:03.631313 2585 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7797d4c54d-kmn9n"] Apr 20 11:44:04.615308 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:04.615231 2585 generic.go:358] "Generic (PLEG): container finished" podID="ba008230-f452-4546-9072-0f9d9eca2357" containerID="9e7f77ac85dcc1078ee33a0a74be32ec319cfb04e4a8cca4732b8558d4aa7796" exitCode=0 Apr 20 11:44:04.615308 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:04.615274 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pc7s" event={"ID":"ba008230-f452-4546-9072-0f9d9eca2357","Type":"ContainerDied","Data":"9e7f77ac85dcc1078ee33a0a74be32ec319cfb04e4a8cca4732b8558d4aa7796"} Apr 20 11:44:04.615738 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:04.615608 2585 scope.go:117] 
"RemoveContainer" containerID="9e7f77ac85dcc1078ee33a0a74be32ec319cfb04e4a8cca4732b8558d4aa7796"
Apr 20 11:44:05.055088 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:05.055054 2585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7" path="/var/lib/kubelet/pods/f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7/volumes"
Apr 20 11:44:05.619565 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:44:05.619534 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pc7s" event={"ID":"ba008230-f452-4546-9072-0f9d9eca2357","Type":"ContainerStarted","Data":"3dc9bb0d13f7e81b59fdc4d0f57b59e9d5b2b98042d6bfafe408d973462abe40"}
Apr 20 11:45:04.302017 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.301986 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8549bfc8c4-kslck"]
Apr 20 11:45:04.302485 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.302321 2585 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b617c5e-ef25-4c2f-b279-519030aa35e0" containerName="registry"
Apr 20 11:45:04.302485 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.302333 2585 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b617c5e-ef25-4c2f-b279-519030aa35e0" containerName="registry"
Apr 20 11:45:04.302485 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.302356 2585 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7" containerName="console"
Apr 20 11:45:04.302485 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.302362 2585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7" containerName="console"
Apr 20 11:45:04.302485 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.302413 2585 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3dd1b5e-8713-4dc8-a2c3-0313bc56fec7" containerName="console"
Apr 20 11:45:04.302485 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.302422 2585 memory_manager.go:356] "RemoveStaleState removing state" podUID="3b617c5e-ef25-4c2f-b279-519030aa35e0" containerName="registry"
Apr 20 11:45:04.305602 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.305584 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.316554 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.316528 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8549bfc8c4-kslck"]
Apr 20 11:45:04.429427 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.429391 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/265e1013-13cd-4f67-a3d0-6705c2bf0d3d-console-serving-cert\") pod \"console-8549bfc8c4-kslck\" (UID: \"265e1013-13cd-4f67-a3d0-6705c2bf0d3d\") " pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.429427 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.429429 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2ktr\" (UniqueName: \"kubernetes.io/projected/265e1013-13cd-4f67-a3d0-6705c2bf0d3d-kube-api-access-m2ktr\") pod \"console-8549bfc8c4-kslck\" (UID: \"265e1013-13cd-4f67-a3d0-6705c2bf0d3d\") " pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.429630 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.429453 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/265e1013-13cd-4f67-a3d0-6705c2bf0d3d-console-config\") pod \"console-8549bfc8c4-kslck\" (UID: \"265e1013-13cd-4f67-a3d0-6705c2bf0d3d\") " pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.429630 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.429507 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/265e1013-13cd-4f67-a3d0-6705c2bf0d3d-console-oauth-config\") pod \"console-8549bfc8c4-kslck\" (UID: \"265e1013-13cd-4f67-a3d0-6705c2bf0d3d\") " pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.429630 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.429584 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/265e1013-13cd-4f67-a3d0-6705c2bf0d3d-trusted-ca-bundle\") pod \"console-8549bfc8c4-kslck\" (UID: \"265e1013-13cd-4f67-a3d0-6705c2bf0d3d\") " pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.429797 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.429655 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/265e1013-13cd-4f67-a3d0-6705c2bf0d3d-service-ca\") pod \"console-8549bfc8c4-kslck\" (UID: \"265e1013-13cd-4f67-a3d0-6705c2bf0d3d\") " pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.429797 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.429717 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/265e1013-13cd-4f67-a3d0-6705c2bf0d3d-oauth-serving-cert\") pod \"console-8549bfc8c4-kslck\" (UID: \"265e1013-13cd-4f67-a3d0-6705c2bf0d3d\") " pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.531001 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.530897 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/265e1013-13cd-4f67-a3d0-6705c2bf0d3d-console-config\") pod \"console-8549bfc8c4-kslck\" (UID: \"265e1013-13cd-4f67-a3d0-6705c2bf0d3d\") " pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.531140 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.531046 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/265e1013-13cd-4f67-a3d0-6705c2bf0d3d-console-oauth-config\") pod \"console-8549bfc8c4-kslck\" (UID: \"265e1013-13cd-4f67-a3d0-6705c2bf0d3d\") " pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.531140 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.531085 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/265e1013-13cd-4f67-a3d0-6705c2bf0d3d-trusted-ca-bundle\") pod \"console-8549bfc8c4-kslck\" (UID: \"265e1013-13cd-4f67-a3d0-6705c2bf0d3d\") " pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.531207 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.531150 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/265e1013-13cd-4f67-a3d0-6705c2bf0d3d-service-ca\") pod \"console-8549bfc8c4-kslck\" (UID: \"265e1013-13cd-4f67-a3d0-6705c2bf0d3d\") " pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.531207 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.531169 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/265e1013-13cd-4f67-a3d0-6705c2bf0d3d-oauth-serving-cert\") pod \"console-8549bfc8c4-kslck\" (UID: \"265e1013-13cd-4f67-a3d0-6705c2bf0d3d\") " pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.531207 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.531188 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/265e1013-13cd-4f67-a3d0-6705c2bf0d3d-console-serving-cert\") pod \"console-8549bfc8c4-kslck\" (UID: \"265e1013-13cd-4f67-a3d0-6705c2bf0d3d\") " pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.531402 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.531213 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2ktr\" (UniqueName: \"kubernetes.io/projected/265e1013-13cd-4f67-a3d0-6705c2bf0d3d-kube-api-access-m2ktr\") pod \"console-8549bfc8c4-kslck\" (UID: \"265e1013-13cd-4f67-a3d0-6705c2bf0d3d\") " pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.531626 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.531595 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/265e1013-13cd-4f67-a3d0-6705c2bf0d3d-console-config\") pod \"console-8549bfc8c4-kslck\" (UID: \"265e1013-13cd-4f67-a3d0-6705c2bf0d3d\") " pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.531928 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.531870 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/265e1013-13cd-4f67-a3d0-6705c2bf0d3d-service-ca\") pod \"console-8549bfc8c4-kslck\" (UID: \"265e1013-13cd-4f67-a3d0-6705c2bf0d3d\") " pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.531928 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.531895 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/265e1013-13cd-4f67-a3d0-6705c2bf0d3d-oauth-serving-cert\") pod \"console-8549bfc8c4-kslck\" (UID: \"265e1013-13cd-4f67-a3d0-6705c2bf0d3d\") " pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.532143 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.531992 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/265e1013-13cd-4f67-a3d0-6705c2bf0d3d-trusted-ca-bundle\") pod \"console-8549bfc8c4-kslck\" (UID: \"265e1013-13cd-4f67-a3d0-6705c2bf0d3d\") " pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.533687 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.533662 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/265e1013-13cd-4f67-a3d0-6705c2bf0d3d-console-oauth-config\") pod \"console-8549bfc8c4-kslck\" (UID: \"265e1013-13cd-4f67-a3d0-6705c2bf0d3d\") " pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.533687 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.533680 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/265e1013-13cd-4f67-a3d0-6705c2bf0d3d-console-serving-cert\") pod \"console-8549bfc8c4-kslck\" (UID: \"265e1013-13cd-4f67-a3d0-6705c2bf0d3d\") " pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.540533 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.540513 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2ktr\" (UniqueName: \"kubernetes.io/projected/265e1013-13cd-4f67-a3d0-6705c2bf0d3d-kube-api-access-m2ktr\") pod \"console-8549bfc8c4-kslck\" (UID: \"265e1013-13cd-4f67-a3d0-6705c2bf0d3d\") " pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.615130 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.615067 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:04.754063 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.753900 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8549bfc8c4-kslck"]
Apr 20 11:45:04.756950 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:45:04.756921 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod265e1013_13cd_4f67_a3d0_6705c2bf0d3d.slice/crio-93ad67f50f3ddfe13410f1ac8f08c9f81ad4e5fe5bb92975a439f3f6f02ac734 WatchSource:0}: Error finding container 93ad67f50f3ddfe13410f1ac8f08c9f81ad4e5fe5bb92975a439f3f6f02ac734: Status 404 returned error can't find the container with id 93ad67f50f3ddfe13410f1ac8f08c9f81ad4e5fe5bb92975a439f3f6f02ac734
Apr 20 11:45:04.792240 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:04.792209 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8549bfc8c4-kslck" event={"ID":"265e1013-13cd-4f67-a3d0-6705c2bf0d3d","Type":"ContainerStarted","Data":"93ad67f50f3ddfe13410f1ac8f08c9f81ad4e5fe5bb92975a439f3f6f02ac734"}
Apr 20 11:45:05.796407 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:05.796366 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8549bfc8c4-kslck" event={"ID":"265e1013-13cd-4f67-a3d0-6705c2bf0d3d","Type":"ContainerStarted","Data":"80ed17c4a484a8cdee31b2952fd5d94b57e1956a35c7d9c0aeb1d019738229dc"}
Apr 20 11:45:05.813983 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:05.813915 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8549bfc8c4-kslck" podStartSLOduration=1.813900876 podStartE2EDuration="1.813900876s" podCreationTimestamp="2026-04-20 11:45:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 11:45:05.813813877 +0000 UTC m=+193.286914980" watchObservedRunningTime="2026-04-20 11:45:05.813900876 +0000 UTC m=+193.287001963"
Apr 20 11:45:14.615247 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:14.615204 2585 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:14.615247 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:14.615253 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:14.620178 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:14.620154 2585 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:14.824806 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:14.824774 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8549bfc8c4-kslck"
Apr 20 11:45:14.869827 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:14.869757 2585 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7f996b6b67-hmd7w"]
Apr 20 11:45:39.889127 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:39.889061 2585 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7f996b6b67-hmd7w" podUID="297c05f3-c78f-4108-9852-e9ea2401a235" containerName="console" containerID="cri-o://588f79af80dbe6e1711304406dffdd3f1ab77aaa9339b4c02f3bd61291274fd0" gracePeriod=15
Apr 20 11:45:40.130968 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.130940 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f996b6b67-hmd7w_297c05f3-c78f-4108-9852-e9ea2401a235/console/0.log"
Apr 20 11:45:40.131083 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.131010 2585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f996b6b67-hmd7w"
Apr 20 11:45:40.230281 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.230204 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-console-config\") pod \"297c05f3-c78f-4108-9852-e9ea2401a235\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") "
Apr 20 11:45:40.230281 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.230235 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/297c05f3-c78f-4108-9852-e9ea2401a235-console-oauth-config\") pod \"297c05f3-c78f-4108-9852-e9ea2401a235\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") "
Apr 20 11:45:40.230281 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.230275 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-trusted-ca-bundle\") pod \"297c05f3-c78f-4108-9852-e9ea2401a235\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") "
Apr 20 11:45:40.230536 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.230290 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-oauth-serving-cert\") pod \"297c05f3-c78f-4108-9852-e9ea2401a235\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") "
Apr 20 11:45:40.230536 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.230431 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/297c05f3-c78f-4108-9852-e9ea2401a235-console-serving-cert\") pod \"297c05f3-c78f-4108-9852-e9ea2401a235\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") "
Apr 20 11:45:40.230536 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.230504 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpvmn\" (UniqueName: \"kubernetes.io/projected/297c05f3-c78f-4108-9852-e9ea2401a235-kube-api-access-vpvmn\") pod \"297c05f3-c78f-4108-9852-e9ea2401a235\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") "
Apr 20 11:45:40.230712 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.230544 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-service-ca\") pod \"297c05f3-c78f-4108-9852-e9ea2401a235\" (UID: \"297c05f3-c78f-4108-9852-e9ea2401a235\") "
Apr 20 11:45:40.230712 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.230647 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "297c05f3-c78f-4108-9852-e9ea2401a235" (UID: "297c05f3-c78f-4108-9852-e9ea2401a235"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 11:45:40.230828 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.230722 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-console-config" (OuterVolumeSpecName: "console-config") pod "297c05f3-c78f-4108-9852-e9ea2401a235" (UID: "297c05f3-c78f-4108-9852-e9ea2401a235"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 11:45:40.230828 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.230768 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "297c05f3-c78f-4108-9852-e9ea2401a235" (UID: "297c05f3-c78f-4108-9852-e9ea2401a235"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 11:45:40.231047 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.231023 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-service-ca" (OuterVolumeSpecName: "service-ca") pod "297c05f3-c78f-4108-9852-e9ea2401a235" (UID: "297c05f3-c78f-4108-9852-e9ea2401a235"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 11:45:40.231630 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.231610 2585 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-console-config\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\""
Apr 20 11:45:40.231734 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.231645 2585 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-trusted-ca-bundle\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\""
Apr 20 11:45:40.231734 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.231674 2585 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-oauth-serving-cert\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\""
Apr 20 11:45:40.232667 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.232641 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/297c05f3-c78f-4108-9852-e9ea2401a235-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "297c05f3-c78f-4108-9852-e9ea2401a235" (UID: "297c05f3-c78f-4108-9852-e9ea2401a235"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 11:45:40.232841 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.232822 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/297c05f3-c78f-4108-9852-e9ea2401a235-kube-api-access-vpvmn" (OuterVolumeSpecName: "kube-api-access-vpvmn") pod "297c05f3-c78f-4108-9852-e9ea2401a235" (UID: "297c05f3-c78f-4108-9852-e9ea2401a235"). InnerVolumeSpecName "kube-api-access-vpvmn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 11:45:40.232939 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.232924 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/297c05f3-c78f-4108-9852-e9ea2401a235-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "297c05f3-c78f-4108-9852-e9ea2401a235" (UID: "297c05f3-c78f-4108-9852-e9ea2401a235"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 11:45:40.332965 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.332940 2585 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vpvmn\" (UniqueName: \"kubernetes.io/projected/297c05f3-c78f-4108-9852-e9ea2401a235-kube-api-access-vpvmn\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\""
Apr 20 11:45:40.332965 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.332965 2585 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/297c05f3-c78f-4108-9852-e9ea2401a235-service-ca\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\""
Apr 20 11:45:40.333118 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.332976 2585 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/297c05f3-c78f-4108-9852-e9ea2401a235-console-oauth-config\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\""
Apr 20 11:45:40.333118 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.332984 2585 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/297c05f3-c78f-4108-9852-e9ea2401a235-console-serving-cert\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\""
Apr 20 11:45:40.893258 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.893228 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f996b6b67-hmd7w_297c05f3-c78f-4108-9852-e9ea2401a235/console/0.log"
Apr 20 11:45:40.893661 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.893272 2585 generic.go:358] "Generic (PLEG): container finished" podID="297c05f3-c78f-4108-9852-e9ea2401a235" containerID="588f79af80dbe6e1711304406dffdd3f1ab77aaa9339b4c02f3bd61291274fd0" exitCode=2
Apr 20 11:45:40.893661 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.893345 2585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f996b6b67-hmd7w"
Apr 20 11:45:40.893661 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.893359 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f996b6b67-hmd7w" event={"ID":"297c05f3-c78f-4108-9852-e9ea2401a235","Type":"ContainerDied","Data":"588f79af80dbe6e1711304406dffdd3f1ab77aaa9339b4c02f3bd61291274fd0"}
Apr 20 11:45:40.893661 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.893400 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f996b6b67-hmd7w" event={"ID":"297c05f3-c78f-4108-9852-e9ea2401a235","Type":"ContainerDied","Data":"f45f010b36598de962b2d336abc0baae6e2765b9d9ccbf297d47053609cbcda4"}
Apr 20 11:45:40.893661 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.893421 2585 scope.go:117] "RemoveContainer" containerID="588f79af80dbe6e1711304406dffdd3f1ab77aaa9339b4c02f3bd61291274fd0"
Apr 20 11:45:40.901887 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.901872 2585 scope.go:117] "RemoveContainer" containerID="588f79af80dbe6e1711304406dffdd3f1ab77aaa9339b4c02f3bd61291274fd0"
Apr 20 11:45:40.902113 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:45:40.902095 2585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"588f79af80dbe6e1711304406dffdd3f1ab77aaa9339b4c02f3bd61291274fd0\": container with ID starting with 588f79af80dbe6e1711304406dffdd3f1ab77aaa9339b4c02f3bd61291274fd0 not found: ID does not exist" containerID="588f79af80dbe6e1711304406dffdd3f1ab77aaa9339b4c02f3bd61291274fd0"
Apr 20 11:45:40.902167 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.902126 2585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"588f79af80dbe6e1711304406dffdd3f1ab77aaa9339b4c02f3bd61291274fd0"} err="failed to get container status \"588f79af80dbe6e1711304406dffdd3f1ab77aaa9339b4c02f3bd61291274fd0\": rpc error: code = NotFound desc = could not find container \"588f79af80dbe6e1711304406dffdd3f1ab77aaa9339b4c02f3bd61291274fd0\": container with ID starting with 588f79af80dbe6e1711304406dffdd3f1ab77aaa9339b4c02f3bd61291274fd0 not found: ID does not exist"
Apr 20 11:45:40.915557 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.915527 2585 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7f996b6b67-hmd7w"]
Apr 20 11:45:40.920021 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:40.920002 2585 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7f996b6b67-hmd7w"]
Apr 20 11:45:41.055045 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:45:41.055013 2585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="297c05f3-c78f-4108-9852-e9ea2401a235" path="/var/lib/kubelet/pods/297c05f3-c78f-4108-9852-e9ea2401a235/volumes"
Apr 20 11:46:07.725757 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:07.725717 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vmvph/must-gather-rlvmx"]
Apr 20 11:46:07.726227 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:07.726165 2585 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="297c05f3-c78f-4108-9852-e9ea2401a235" containerName="console"
Apr 20 11:46:07.726227 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:07.726183 2585 state_mem.go:107] "Deleted CPUSet assignment" podUID="297c05f3-c78f-4108-9852-e9ea2401a235" containerName="console"
Apr 20 11:46:07.726343 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:07.726262 2585 memory_manager.go:356] "RemoveStaleState removing state" podUID="297c05f3-c78f-4108-9852-e9ea2401a235" containerName="console"
Apr 20 11:46:07.730972 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:07.730950 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vmvph/must-gather-rlvmx"
Apr 20 11:46:07.734313 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:07.734286 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vmvph\"/\"openshift-service-ca.crt\""
Apr 20 11:46:07.734443 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:07.734422 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vmvph\"/\"default-dockercfg-tpl68\""
Apr 20 11:46:07.735864 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:07.735842 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vmvph\"/\"kube-root-ca.crt\""
Apr 20 11:46:07.738170 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:07.738147 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vmvph/must-gather-rlvmx"]
Apr 20 11:46:07.760671 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:07.760646 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f969b\" (UniqueName: \"kubernetes.io/projected/ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa-kube-api-access-f969b\") pod \"must-gather-rlvmx\" (UID: \"ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa\") " pod="openshift-must-gather-vmvph/must-gather-rlvmx"
Apr 20 11:46:07.760815 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:07.760677 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa-must-gather-output\") pod \"must-gather-rlvmx\" (UID: \"ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa\") " pod="openshift-must-gather-vmvph/must-gather-rlvmx"
Apr 20 11:46:07.861568 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:07.861537 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f969b\" (UniqueName: \"kubernetes.io/projected/ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa-kube-api-access-f969b\") pod \"must-gather-rlvmx\" (UID: \"ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa\") " pod="openshift-must-gather-vmvph/must-gather-rlvmx"
Apr 20 11:46:07.861568 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:07.861570 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa-must-gather-output\") pod \"must-gather-rlvmx\" (UID: \"ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa\") " pod="openshift-must-gather-vmvph/must-gather-rlvmx"
Apr 20 11:46:07.861944 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:07.861923 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa-must-gather-output\") pod \"must-gather-rlvmx\" (UID: \"ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa\") " pod="openshift-must-gather-vmvph/must-gather-rlvmx"
Apr 20 11:46:07.870442 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:07.870424 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f969b\" (UniqueName: \"kubernetes.io/projected/ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa-kube-api-access-f969b\") pod \"must-gather-rlvmx\" (UID: \"ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa\") " pod="openshift-must-gather-vmvph/must-gather-rlvmx"
Apr 20 11:46:08.057358 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:08.057324 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vmvph/must-gather-rlvmx"
Apr 20 11:46:08.178712 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:08.178673 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vmvph/must-gather-rlvmx"]
Apr 20 11:46:08.181073 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:46:08.181042 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddf6bbb2_4b37_49c7_b429_3d4aacafcdfa.slice/crio-7ac27f228d65557a35dd11b4f3a22791c4654bea8128fe3428aed275aa2e5e2e WatchSource:0}: Error finding container 7ac27f228d65557a35dd11b4f3a22791c4654bea8128fe3428aed275aa2e5e2e: Status 404 returned error can't find the container with id 7ac27f228d65557a35dd11b4f3a22791c4654bea8128fe3428aed275aa2e5e2e
Apr 20 11:46:08.980014 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:08.979919 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vmvph/must-gather-rlvmx" event={"ID":"ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa","Type":"ContainerStarted","Data":"7ac27f228d65557a35dd11b4f3a22791c4654bea8128fe3428aed275aa2e5e2e"}
Apr 20 11:46:12.994007 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:12.993870 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vmvph/must-gather-rlvmx" event={"ID":"ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa","Type":"ContainerStarted","Data":"230f8fe291e74585cd5aab705f4bd0403f0af71e2650383ecd6d205922d7d16e"}
Apr 20 11:46:12.994007 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:12.993923 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vmvph/must-gather-rlvmx" event={"ID":"ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa","Type":"ContainerStarted","Data":"6d6c19238a4595a762aebd5ccb86b81b66bf91f147c82dfe397f31c285eb3e96"}
Apr 20 11:46:13.013091 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:13.013045 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vmvph/must-gather-rlvmx" podStartSLOduration=1.802652983 podStartE2EDuration="6.013031074s" podCreationTimestamp="2026-04-20 11:46:07 +0000 UTC" firstStartedPulling="2026-04-20 11:46:08.182803013 +0000 UTC m=+255.655904076" lastFinishedPulling="2026-04-20 11:46:12.393181087 +0000 UTC m=+259.866282167" observedRunningTime="2026-04-20 11:46:13.011129198 +0000 UTC m=+260.484230294" watchObservedRunningTime="2026-04-20 11:46:13.013031074 +0000 UTC m=+260.486132159"
Apr 20 11:46:21.017248 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:21.017214 2585 generic.go:358] "Generic (PLEG): container finished" podID="ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa" containerID="6d6c19238a4595a762aebd5ccb86b81b66bf91f147c82dfe397f31c285eb3e96" exitCode=0
Apr 20 11:46:21.017657 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:21.017297 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vmvph/must-gather-rlvmx" event={"ID":"ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa","Type":"ContainerDied","Data":"6d6c19238a4595a762aebd5ccb86b81b66bf91f147c82dfe397f31c285eb3e96"}
Apr 20 11:46:21.017657 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:21.017616 2585 scope.go:117] "RemoveContainer" containerID="6d6c19238a4595a762aebd5ccb86b81b66bf91f147c82dfe397f31c285eb3e96"
Apr 20 11:46:21.135316 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:21.135275 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vmvph_must-gather-rlvmx_ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa/gather/0.log"
Apr 20 11:46:21.606752 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:21.606655 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9p767/must-gather-pjc7n"]
Apr 20 11:46:21.609865 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:21.609845 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9p767/must-gather-pjc7n"
Apr 20 11:46:21.612944 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:21.612920 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9p767\"/\"kube-root-ca.crt\""
Apr 20 11:46:21.613041 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:21.612920 2585 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9p767\"/\"openshift-service-ca.crt\""
Apr 20 11:46:21.613041 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:21.612923 2585 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-9p767\"/\"default-dockercfg-dcczl\""
Apr 20 11:46:21.618292 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:21.618268 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9p767/must-gather-pjc7n"]
Apr 20 11:46:21.674832 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:21.674803 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/72f63ee9-7c2d-4149-8902-4475b1b5e475-must-gather-output\") pod \"must-gather-pjc7n\" (UID: \"72f63ee9-7c2d-4149-8902-4475b1b5e475\") " pod="openshift-must-gather-9p767/must-gather-pjc7n"
Apr 20 11:46:21.674988 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:21.674861 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rs7j\" (UniqueName: \"kubernetes.io/projected/72f63ee9-7c2d-4149-8902-4475b1b5e475-kube-api-access-6rs7j\") pod \"must-gather-pjc7n\" (UID: \"72f63ee9-7c2d-4149-8902-4475b1b5e475\") " pod="openshift-must-gather-9p767/must-gather-pjc7n"
Apr 20 11:46:21.775806 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:21.775768 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rs7j\" (UniqueName: \"kubernetes.io/projected/72f63ee9-7c2d-4149-8902-4475b1b5e475-kube-api-access-6rs7j\") pod \"must-gather-pjc7n\" (UID: \"72f63ee9-7c2d-4149-8902-4475b1b5e475\") " pod="openshift-must-gather-9p767/must-gather-pjc7n"
Apr 20 11:46:21.776011 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:21.775842 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/72f63ee9-7c2d-4149-8902-4475b1b5e475-must-gather-output\") pod \"must-gather-pjc7n\" (UID: \"72f63ee9-7c2d-4149-8902-4475b1b5e475\") " pod="openshift-must-gather-9p767/must-gather-pjc7n"
Apr 20 11:46:21.776139 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:21.776123 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/72f63ee9-7c2d-4149-8902-4475b1b5e475-must-gather-output\") pod \"must-gather-pjc7n\" (UID: \"72f63ee9-7c2d-4149-8902-4475b1b5e475\") " pod="openshift-must-gather-9p767/must-gather-pjc7n"
Apr 20 11:46:21.784466 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:21.784438 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rs7j\" (UniqueName: \"kubernetes.io/projected/72f63ee9-7c2d-4149-8902-4475b1b5e475-kube-api-access-6rs7j\") pod \"must-gather-pjc7n\" (UID: \"72f63ee9-7c2d-4149-8902-4475b1b5e475\") " pod="openshift-must-gather-9p767/must-gather-pjc7n"
Apr 20 11:46:21.919622 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:21.919543 2585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9p767/must-gather-pjc7n"
Apr 20 11:46:22.040356 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:22.040326 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9p767/must-gather-pjc7n"]
Apr 20 11:46:22.043202 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:46:22.043176 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72f63ee9_7c2d_4149_8902_4475b1b5e475.slice/crio-19fea4b4a986310d1240a686e30e13427fed4c0a9c43e59c9fe2c5db136ce928 WatchSource:0}: Error finding container 19fea4b4a986310d1240a686e30e13427fed4c0a9c43e59c9fe2c5db136ce928: Status 404 returned error can't find the container with id 19fea4b4a986310d1240a686e30e13427fed4c0a9c43e59c9fe2c5db136ce928
Apr 20 11:46:23.024205 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:23.024179 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9p767/must-gather-pjc7n" event={"ID":"72f63ee9-7c2d-4149-8902-4475b1b5e475","Type":"ContainerStarted","Data":"19fea4b4a986310d1240a686e30e13427fed4c0a9c43e59c9fe2c5db136ce928"}
Apr 20 11:46:24.030135 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:24.030085 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9p767/must-gather-pjc7n" event={"ID":"72f63ee9-7c2d-4149-8902-4475b1b5e475","Type":"ContainerStarted","Data":"9b961f4b8ef1abf88b4519c0985f8cd01dbf5fcc339d30bca7af0018fbde247b"}
Apr 20 11:46:24.030135 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:24.030141 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9p767/must-gather-pjc7n" event={"ID":"72f63ee9-7c2d-4149-8902-4475b1b5e475","Type":"ContainerStarted","Data":"7dfc830165972d0ddb9e97f268c473334af5e3b92820f5c7b9102dd0cf34edb5"}
Apr 20 11:46:24.049309 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:24.049257 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-must-gather-9p767/must-gather-pjc7n" podStartSLOduration=2.177910832 podStartE2EDuration="3.049241108s" podCreationTimestamp="2026-04-20 11:46:21 +0000 UTC" firstStartedPulling="2026-04-20 11:46:22.044892239 +0000 UTC m=+269.517993303" lastFinishedPulling="2026-04-20 11:46:22.916222508 +0000 UTC m=+270.389323579" observedRunningTime="2026-04-20 11:46:24.047551008 +0000 UTC m=+271.520652088" watchObservedRunningTime="2026-04-20 11:46:24.049241108 +0000 UTC m=+271.522342219" Apr 20 11:46:24.235535 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:24.235500 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-48wpt_07c8c3fc-2976-4ee9-904f-92f6a777b537/global-pull-secret-syncer/0.log" Apr 20 11:46:24.369261 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:24.369182 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-92zb6_5e22d374-f2bf-4a0f-8b5f-a2396ea20c95/konnectivity-agent/0.log" Apr 20 11:46:24.479159 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:24.479128 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-26.ec2.internal_1a9153efe987700267f82546f061485e/haproxy/0.log" Apr 20 11:46:26.463525 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:26.463473 2585 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vmvph/must-gather-rlvmx"] Apr 20 11:46:26.464037 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:26.463803 2585 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-vmvph/must-gather-rlvmx" podUID="ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa" containerName="copy" containerID="cri-o://230f8fe291e74585cd5aab705f4bd0403f0af71e2650383ecd6d205922d7d16e" gracePeriod=2 Apr 20 11:46:26.470416 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:26.470254 2585 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-vmvph/must-gather-rlvmx"] Apr 20 11:46:26.822713 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:26.819465 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vmvph_must-gather-rlvmx_ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa/copy/0.log" Apr 20 11:46:26.822713 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:26.819906 2585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vmvph/must-gather-rlvmx" Apr 20 11:46:26.822975 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:26.822727 2585 status_manager.go:895] "Failed to get status for pod" podUID="ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa" pod="openshift-must-gather-vmvph/must-gather-rlvmx" err="pods \"must-gather-rlvmx\" is forbidden: User \"system:node:ip-10-0-141-26.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-vmvph\": no relationship found between node 'ip-10-0-141-26.ec2.internal' and this object" Apr 20 11:46:26.938549 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:26.937879 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa-must-gather-output\") pod \"ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa\" (UID: \"ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa\") " Apr 20 11:46:26.938549 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:26.937986 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f969b\" (UniqueName: \"kubernetes.io/projected/ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa-kube-api-access-f969b\") pod \"ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa\" (UID: \"ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa\") " Apr 20 11:46:26.939317 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:26.939242 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa" (UID: "ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 11:46:26.950268 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:26.950209 2585 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa-kube-api-access-f969b" (OuterVolumeSpecName: "kube-api-access-f969b") pod "ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa" (UID: "ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa"). InnerVolumeSpecName "kube-api-access-f969b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 11:46:27.040042 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:27.039950 2585 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa-must-gather-output\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\"" Apr 20 11:46:27.040042 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:27.039985 2585 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f969b\" (UniqueName: \"kubernetes.io/projected/ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa-kube-api-access-f969b\") on node \"ip-10-0-141-26.ec2.internal\" DevicePath \"\"" Apr 20 11:46:27.049791 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:27.045705 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vmvph_must-gather-rlvmx_ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa/copy/0.log" Apr 20 11:46:27.049791 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:27.046083 2585 generic.go:358] "Generic (PLEG): container finished" podID="ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa" containerID="230f8fe291e74585cd5aab705f4bd0403f0af71e2650383ecd6d205922d7d16e" exitCode=143 Apr 20 11:46:27.049791 ip-10-0-141-26 
kubenswrapper[2585]: I0420 11:46:27.046196 2585 scope.go:117] "RemoveContainer" containerID="230f8fe291e74585cd5aab705f4bd0403f0af71e2650383ecd6d205922d7d16e" Apr 20 11:46:27.049791 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:27.046326 2585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vmvph/must-gather-rlvmx" Apr 20 11:46:27.052947 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:27.052904 2585 status_manager.go:895] "Failed to get status for pod" podUID="ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa" pod="openshift-must-gather-vmvph/must-gather-rlvmx" err="pods \"must-gather-rlvmx\" is forbidden: User \"system:node:ip-10-0-141-26.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-vmvph\": no relationship found between node 'ip-10-0-141-26.ec2.internal' and this object" Apr 20 11:46:27.064881 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:27.063035 2585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa" path="/var/lib/kubelet/pods/ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa/volumes" Apr 20 11:46:27.071209 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:27.071189 2585 scope.go:117] "RemoveContainer" containerID="6d6c19238a4595a762aebd5ccb86b81b66bf91f147c82dfe397f31c285eb3e96" Apr 20 11:46:27.091913 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:27.091881 2585 scope.go:117] "RemoveContainer" containerID="230f8fe291e74585cd5aab705f4bd0403f0af71e2650383ecd6d205922d7d16e" Apr 20 11:46:27.093866 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:46:27.093835 2585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"230f8fe291e74585cd5aab705f4bd0403f0af71e2650383ecd6d205922d7d16e\": container with ID starting with 230f8fe291e74585cd5aab705f4bd0403f0af71e2650383ecd6d205922d7d16e not found: ID does not exist" 
containerID="230f8fe291e74585cd5aab705f4bd0403f0af71e2650383ecd6d205922d7d16e" Apr 20 11:46:27.094054 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:27.094027 2585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"230f8fe291e74585cd5aab705f4bd0403f0af71e2650383ecd6d205922d7d16e"} err="failed to get container status \"230f8fe291e74585cd5aab705f4bd0403f0af71e2650383ecd6d205922d7d16e\": rpc error: code = NotFound desc = could not find container \"230f8fe291e74585cd5aab705f4bd0403f0af71e2650383ecd6d205922d7d16e\": container with ID starting with 230f8fe291e74585cd5aab705f4bd0403f0af71e2650383ecd6d205922d7d16e not found: ID does not exist" Apr 20 11:46:27.094169 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:27.094157 2585 scope.go:117] "RemoveContainer" containerID="6d6c19238a4595a762aebd5ccb86b81b66bf91f147c82dfe397f31c285eb3e96" Apr 20 11:46:27.094646 ip-10-0-141-26 kubenswrapper[2585]: E0420 11:46:27.094613 2585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d6c19238a4595a762aebd5ccb86b81b66bf91f147c82dfe397f31c285eb3e96\": container with ID starting with 6d6c19238a4595a762aebd5ccb86b81b66bf91f147c82dfe397f31c285eb3e96 not found: ID does not exist" containerID="6d6c19238a4595a762aebd5ccb86b81b66bf91f147c82dfe397f31c285eb3e96" Apr 20 11:46:27.094773 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:27.094670 2585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d6c19238a4595a762aebd5ccb86b81b66bf91f147c82dfe397f31c285eb3e96"} err="failed to get container status \"6d6c19238a4595a762aebd5ccb86b81b66bf91f147c82dfe397f31c285eb3e96\": rpc error: code = NotFound desc = could not find container \"6d6c19238a4595a762aebd5ccb86b81b66bf91f147c82dfe397f31c285eb3e96\": container with ID starting with 6d6c19238a4595a762aebd5ccb86b81b66bf91f147c82dfe397f31c285eb3e96 not found: ID does not exist" Apr 20 
11:46:27.936443 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:27.936399 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-xvfqc_6af6cd58-fc4f-4628-8259-91d3ffbbcea7/cluster-monitoring-operator/0.log" Apr 20 11:46:28.037904 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:28.037863 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6b58ddffd5-kfq6v_1fc2020e-e650-43bd-b853-69fd2c470198/metrics-server/0.log" Apr 20 11:46:28.066268 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:28.066237 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-qncmn_04d7888a-8e02-4e73-af80-956c151f0ff8/monitoring-plugin/0.log" Apr 20 11:46:28.108527 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:28.108489 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ltnvp_4969e65e-b617-4944-affd-e915820a2349/node-exporter/0.log" Apr 20 11:46:28.139462 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:28.139430 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ltnvp_4969e65e-b617-4944-affd-e915820a2349/kube-rbac-proxy/0.log" Apr 20 11:46:28.169634 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:28.169609 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ltnvp_4969e65e-b617-4944-affd-e915820a2349/init-textfile/0.log" Apr 20 11:46:28.345548 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:28.345520 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2mmtn_d729fe95-de51-488a-8d09-dcff02e454d4/kube-rbac-proxy-main/0.log" Apr 20 11:46:28.367060 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:28.367029 2585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2mmtn_d729fe95-de51-488a-8d09-dcff02e454d4/kube-rbac-proxy-self/0.log" Apr 20 11:46:28.384643 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:28.384599 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2mmtn_d729fe95-de51-488a-8d09-dcff02e454d4/openshift-state-metrics/0.log" Apr 20 11:46:28.636603 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:28.636465 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-vgkcl_d6aab5ee-7cd7-4b3e-8f13-1f59029ad976/prometheus-operator-admission-webhook/0.log" Apr 20 11:46:28.673193 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:28.673161 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-857cf956d6-dw8b7_ca478535-97ee-441f-b4d4-efd62183ae21/telemeter-client/0.log" Apr 20 11:46:28.695919 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:28.695889 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-857cf956d6-dw8b7_ca478535-97ee-441f-b4d4-efd62183ae21/reload/0.log" Apr 20 11:46:28.726480 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:28.726452 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-857cf956d6-dw8b7_ca478535-97ee-441f-b4d4-efd62183ae21/kube-rbac-proxy/0.log" Apr 20 11:46:29.868895 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:29.868861 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-6rsg9_559eea80-d496-4a24-9a29-8c322f86b200/networking-console-plugin/0.log" Apr 20 11:46:30.611420 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.611373 2585 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v"] Apr 20 
11:46:30.611757 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.611740 2585 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa" containerName="gather" Apr 20 11:46:30.611757 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.611756 2585 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa" containerName="gather" Apr 20 11:46:30.611933 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.611777 2585 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa" containerName="copy" Apr 20 11:46:30.611933 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.611782 2585 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa" containerName="copy" Apr 20 11:46:30.611933 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.611840 2585 memory_manager.go:356] "RemoveStaleState removing state" podUID="ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa" containerName="gather" Apr 20 11:46:30.611933 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.611848 2585 memory_manager.go:356] "RemoveStaleState removing state" podUID="ddf6bbb2-4b37-49c7-b429-3d4aacafcdfa" containerName="copy" Apr 20 11:46:30.615498 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.615477 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" Apr 20 11:46:30.629840 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.629815 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v"] Apr 20 11:46:30.630447 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.630415 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8549bfc8c4-kslck_265e1013-13cd-4f67-a3d0-6705c2bf0d3d/console/0.log" Apr 20 11:46:30.674460 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.674426 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg9qq\" (UniqueName: \"kubernetes.io/projected/4b2e8e1a-2d2e-48df-90a8-daaa6d117560-kube-api-access-kg9qq\") pod \"perf-node-gather-daemonset-4sr6v\" (UID: \"4b2e8e1a-2d2e-48df-90a8-daaa6d117560\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" Apr 20 11:46:30.674614 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.674497 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b2e8e1a-2d2e-48df-90a8-daaa6d117560-sys\") pod \"perf-node-gather-daemonset-4sr6v\" (UID: \"4b2e8e1a-2d2e-48df-90a8-daaa6d117560\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" Apr 20 11:46:30.674614 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.674532 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4b2e8e1a-2d2e-48df-90a8-daaa6d117560-proc\") pod \"perf-node-gather-daemonset-4sr6v\" (UID: \"4b2e8e1a-2d2e-48df-90a8-daaa6d117560\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" Apr 20 11:46:30.674614 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.674553 2585 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4b2e8e1a-2d2e-48df-90a8-daaa6d117560-lib-modules\") pod \"perf-node-gather-daemonset-4sr6v\" (UID: \"4b2e8e1a-2d2e-48df-90a8-daaa6d117560\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" Apr 20 11:46:30.674614 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.674575 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4b2e8e1a-2d2e-48df-90a8-daaa6d117560-podres\") pod \"perf-node-gather-daemonset-4sr6v\" (UID: \"4b2e8e1a-2d2e-48df-90a8-daaa6d117560\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" Apr 20 11:46:30.775314 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.775283 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b2e8e1a-2d2e-48df-90a8-daaa6d117560-sys\") pod \"perf-node-gather-daemonset-4sr6v\" (UID: \"4b2e8e1a-2d2e-48df-90a8-daaa6d117560\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" Apr 20 11:46:30.775314 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.775320 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4b2e8e1a-2d2e-48df-90a8-daaa6d117560-proc\") pod \"perf-node-gather-daemonset-4sr6v\" (UID: \"4b2e8e1a-2d2e-48df-90a8-daaa6d117560\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" Apr 20 11:46:30.775542 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.775341 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4b2e8e1a-2d2e-48df-90a8-daaa6d117560-lib-modules\") pod \"perf-node-gather-daemonset-4sr6v\" (UID: \"4b2e8e1a-2d2e-48df-90a8-daaa6d117560\") " 
pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" Apr 20 11:46:30.775542 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.775357 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4b2e8e1a-2d2e-48df-90a8-daaa6d117560-podres\") pod \"perf-node-gather-daemonset-4sr6v\" (UID: \"4b2e8e1a-2d2e-48df-90a8-daaa6d117560\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" Apr 20 11:46:30.775542 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.775403 2585 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kg9qq\" (UniqueName: \"kubernetes.io/projected/4b2e8e1a-2d2e-48df-90a8-daaa6d117560-kube-api-access-kg9qq\") pod \"perf-node-gather-daemonset-4sr6v\" (UID: \"4b2e8e1a-2d2e-48df-90a8-daaa6d117560\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" Apr 20 11:46:30.775542 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.775429 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4b2e8e1a-2d2e-48df-90a8-daaa6d117560-proc\") pod \"perf-node-gather-daemonset-4sr6v\" (UID: \"4b2e8e1a-2d2e-48df-90a8-daaa6d117560\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" Apr 20 11:46:30.775542 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.775429 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b2e8e1a-2d2e-48df-90a8-daaa6d117560-sys\") pod \"perf-node-gather-daemonset-4sr6v\" (UID: \"4b2e8e1a-2d2e-48df-90a8-daaa6d117560\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" Apr 20 11:46:30.775542 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.775482 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4b2e8e1a-2d2e-48df-90a8-daaa6d117560-lib-modules\") 
pod \"perf-node-gather-daemonset-4sr6v\" (UID: \"4b2e8e1a-2d2e-48df-90a8-daaa6d117560\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" Apr 20 11:46:30.775542 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.775511 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4b2e8e1a-2d2e-48df-90a8-daaa6d117560-podres\") pod \"perf-node-gather-daemonset-4sr6v\" (UID: \"4b2e8e1a-2d2e-48df-90a8-daaa6d117560\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" Apr 20 11:46:30.785179 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.785156 2585 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg9qq\" (UniqueName: \"kubernetes.io/projected/4b2e8e1a-2d2e-48df-90a8-daaa6d117560-kube-api-access-kg9qq\") pod \"perf-node-gather-daemonset-4sr6v\" (UID: \"4b2e8e1a-2d2e-48df-90a8-daaa6d117560\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" Apr 20 11:46:30.927507 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:30.926992 2585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" Apr 20 11:46:31.074932 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:31.074464 2585 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v"] Apr 20 11:46:31.079719 ip-10-0-141-26 kubenswrapper[2585]: W0420 11:46:31.077796 2585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4b2e8e1a_2d2e_48df_90a8_daaa6d117560.slice/crio-a860a4b24489aa0ff96d51e06382d5d8a688483f838ccb2405d33a139dad98c5 WatchSource:0}: Error finding container a860a4b24489aa0ff96d51e06382d5d8a688483f838ccb2405d33a139dad98c5: Status 404 returned error can't find the container with id a860a4b24489aa0ff96d51e06382d5d8a688483f838ccb2405d33a139dad98c5 Apr 20 11:46:31.678076 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:31.678048 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cdjdz_920f60a6-ca33-484b-b844-63588b7c2913/dns/0.log" Apr 20 11:46:31.718059 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:31.718011 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cdjdz_920f60a6-ca33-484b-b844-63588b7c2913/kube-rbac-proxy/0.log" Apr 20 11:46:31.918009 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:31.917980 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-plzm6_80706795-3e55-4fb6-9b83-da08f0522340/dns-node-resolver/0.log" Apr 20 11:46:32.066315 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:32.066277 2585 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" event={"ID":"4b2e8e1a-2d2e-48df-90a8-daaa6d117560","Type":"ContainerStarted","Data":"d89eea7ed0da6553182fbe52ea07dbf7f17a0041cee07dd1ef472404cb2c7caf"} Apr 20 11:46:32.066315 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:32.066316 2585 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" event={"ID":"4b2e8e1a-2d2e-48df-90a8-daaa6d117560","Type":"ContainerStarted","Data":"a860a4b24489aa0ff96d51e06382d5d8a688483f838ccb2405d33a139dad98c5"} Apr 20 11:46:32.066766 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:32.066418 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" Apr 20 11:46:32.094843 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:32.094797 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" podStartSLOduration=2.094780942 podStartE2EDuration="2.094780942s" podCreationTimestamp="2026-04-20 11:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 11:46:32.091892487 +0000 UTC m=+279.564993574" watchObservedRunningTime="2026-04-20 11:46:32.094780942 +0000 UTC m=+279.567882027" Apr 20 11:46:32.342610 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:32.342533 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kvmvq_bb716ff4-9386-4b54-8b88-2680a1fb36a1/node-ca/0.log" Apr 20 11:46:33.448815 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:33.448787 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rbgjz_c665ef3f-dfe6-4608-901f-51a3bf39c346/serve-healthcheck-canary/0.log" Apr 20 11:46:33.772553 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:33.772473 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-4hfdx_4f3fdecc-690d-48d0-95a7-c7427f0f366b/insights-operator/0.log" Apr 20 11:46:33.793647 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:33.793616 2585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-f6qb2_1e431d47-4bc8-456d-a58b-fb3263c9d358/kube-rbac-proxy/0.log" Apr 20 11:46:33.817339 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:33.817311 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-f6qb2_1e431d47-4bc8-456d-a58b-fb3263c9d358/exporter/0.log" Apr 20 11:46:33.838990 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:33.838967 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-f6qb2_1e431d47-4bc8-456d-a58b-fb3263c9d358/extractor/0.log" Apr 20 11:46:37.563184 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:37.563154 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-7b657_8a7500c2-cf33-4c82-888a-5ad636ca165a/migrator/0.log" Apr 20 11:46:37.578499 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:37.578476 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-7b657_8a7500c2-cf33-4c82-888a-5ad636ca165a/graceful-termination/0.log" Apr 20 11:46:38.079458 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:38.079431 2585 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-9p767/perf-node-gather-daemonset-4sr6v" Apr 20 11:46:38.584809 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:38.584777 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4m7pn_88e132ed-bc16-4c9e-a2a8-1f11c7217cd0/kube-multus/0.log" Apr 20 11:46:38.606271 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:38.606244 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2kbqc_7648da60-df45-45eb-92a1-f0b097849361/kube-multus-additional-cni-plugins/0.log" Apr 20 11:46:38.623319 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:38.623292 2585 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2kbqc_7648da60-df45-45eb-92a1-f0b097849361/egress-router-binary-copy/0.log"
Apr 20 11:46:38.641376 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:38.641349 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2kbqc_7648da60-df45-45eb-92a1-f0b097849361/cni-plugins/0.log"
Apr 20 11:46:38.657435 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:38.657410 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2kbqc_7648da60-df45-45eb-92a1-f0b097849361/bond-cni-plugin/0.log"
Apr 20 11:46:38.674285 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:38.674256 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2kbqc_7648da60-df45-45eb-92a1-f0b097849361/routeoverride-cni/0.log"
Apr 20 11:46:38.691631 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:38.691599 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2kbqc_7648da60-df45-45eb-92a1-f0b097849361/whereabouts-cni-bincopy/0.log"
Apr 20 11:46:38.708765 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:38.708741 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2kbqc_7648da60-df45-45eb-92a1-f0b097849361/whereabouts-cni/0.log"
Apr 20 11:46:39.147668 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:39.147631 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gl2dq_3f86abc4-981a-497f-8da8-2b998417e124/network-metrics-daemon/0.log"
Apr 20 11:46:39.163638 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:39.163608 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gl2dq_3f86abc4-981a-497f-8da8-2b998417e124/kube-rbac-proxy/0.log"
Apr 20 11:46:40.553177 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:40.553147 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6c62_9a0e8a26-b424-497e-b30b-d497b9949b05/ovn-controller/0.log"
Apr 20 11:46:40.569549 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:40.569508 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6c62_9a0e8a26-b424-497e-b30b-d497b9949b05/ovn-acl-logging/0.log"
Apr 20 11:46:40.572341 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:40.572319 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6c62_9a0e8a26-b424-497e-b30b-d497b9949b05/ovn-acl-logging/1.log"
Apr 20 11:46:40.589844 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:40.589824 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6c62_9a0e8a26-b424-497e-b30b-d497b9949b05/kube-rbac-proxy-node/0.log"
Apr 20 11:46:40.610860 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:40.610839 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6c62_9a0e8a26-b424-497e-b30b-d497b9949b05/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 11:46:40.627332 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:40.627306 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6c62_9a0e8a26-b424-497e-b30b-d497b9949b05/northd/0.log"
Apr 20 11:46:40.644454 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:40.644437 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6c62_9a0e8a26-b424-497e-b30b-d497b9949b05/nbdb/0.log"
Apr 20 11:46:40.661737 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:40.661710 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6c62_9a0e8a26-b424-497e-b30b-d497b9949b05/sbdb/0.log"
Apr 20 11:46:40.852388 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:40.852312 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6c62_9a0e8a26-b424-497e-b30b-d497b9949b05/ovnkube-controller/0.log"
Apr 20 11:46:41.809837 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:41.809805 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-lg2zf_ccc123a9-d94a-4fd5-9328-4df77651bc0e/check-endpoints/0.log"
Apr 20 11:46:41.852725 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:41.852680 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-mrrxd_bceb7c3e-9a84-4f27-8b25-3497e4f2353e/network-check-target-container/0.log"
Apr 20 11:46:42.595438 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:42.595411 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-gnwr5_2b81064b-5a70-4705-b8d7-bf578249b1ec/iptables-alerter/0.log"
Apr 20 11:46:43.218611 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:43.218587 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-hmhcw_fd307f40-3318-4b65-b92c-eced354114fd/tuned/0.log"
Apr 20 11:46:44.761025 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:44.760990 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-xlb7l_cb4b3505-fd96-4a34-9b8a-35b95c2afdec/cluster-samples-operator/0.log"
Apr 20 11:46:44.776026 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:44.776000 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-xlb7l_cb4b3505-fd96-4a34-9b8a-35b95c2afdec/cluster-samples-operator-watch/0.log"
Apr 20 11:46:45.601673 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:45.601629 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-9pc7s_ba008230-f452-4546-9072-0f9d9eca2357/service-ca-operator/1.log"
Apr 20 11:46:45.603552 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:45.603530 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-9pc7s_ba008230-f452-4546-9072-0f9d9eca2357/service-ca-operator/0.log"
Apr 20 11:46:45.884258 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:45.884183 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-ljhs9_cfeb0647-4980-4ec3-8246-117ecbebb052/service-ca-controller/0.log"
Apr 20 11:46:46.261055 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:46.260978 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-wt4qp_8e0954af-e279-48c2-8485-2e1a2c5da32f/csi-driver/0.log"
Apr 20 11:46:46.279924 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:46.279893 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-wt4qp_8e0954af-e279-48c2-8485-2e1a2c5da32f/csi-node-driver-registrar/0.log"
Apr 20 11:46:46.297004 ip-10-0-141-26 kubenswrapper[2585]: I0420 11:46:46.296970 2585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-wt4qp_8e0954af-e279-48c2-8485-2e1a2c5da32f/csi-liveness-probe/0.log"