Apr 23 16:31:56.576790 ip-10-0-133-231 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 16:31:56.576804 ip-10-0-133-231 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 16:31:56.576814 ip-10-0-133-231 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 16:31:56.577170 ip-10-0-133-231 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 16:32:06.721605 ip-10-0-133-231 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 16:32:06.721621 ip-10-0-133-231 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 183b4d435b4948498b1433dd87d52f01 --
Apr 23 16:34:06.296179 ip-10-0-133-231 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 16:34:06.795079 ip-10-0-133-231 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:34:06.795079 ip-10-0-133-231 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 16:34:06.795079 ip-10-0-133-231 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:34:06.795079 ip-10-0-133-231 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 16:34:06.795079 ip-10-0-133-231 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:34:06.796976 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.796868    2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 16:34:06.802641 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802618    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:34:06.802641 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802636    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:34:06.802641 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802640    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:34:06.802641 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802644    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:34:06.802641 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802647    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:34:06.802641 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802650    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:34:06.802862 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802653    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:34:06.802862 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802656    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:34:06.802862 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802659    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:34:06.802862 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802665    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:34:06.802862 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802671    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:34:06.802862 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802674    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:34:06.802862 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802677    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:34:06.802862 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802679    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:34:06.802862 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802682    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:34:06.802862 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802685    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:34:06.802862 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802687    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:34:06.802862 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802690    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:34:06.802862 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802693    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:34:06.802862 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802696    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:34:06.802862 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802698    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:34:06.802862 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802701    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:34:06.802862 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802706    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:34:06.802862 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802709    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:34:06.802862 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802711    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:34:06.803379 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802714    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:34:06.803379 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802717    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:34:06.803379 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802719    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:34:06.803379 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802722    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:34:06.803379 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802724    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:34:06.803379 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802727    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:34:06.803379 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802730    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:34:06.803379 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802734    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:34:06.803379 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802738    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:34:06.803379 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802741    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:34:06.803379 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802747    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:34:06.803379 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802749    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:34:06.803379 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802752    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:34:06.803379 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802756    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:34:06.803379 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802760    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:34:06.803379 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802762    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:34:06.803379 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802765    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:34:06.803379 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802768    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:34:06.803379 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802771    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:34:06.803379 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802773    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:34:06.803877 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802776    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:34:06.803877 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802780    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:34:06.803877 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802786    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:34:06.803877 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802790    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:34:06.803877 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802792    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:34:06.803877 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802795    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:34:06.803877 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802797    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:34:06.803877 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802800    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:34:06.803877 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802803    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:34:06.803877 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802805    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:34:06.803877 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802808    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:34:06.803877 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802811    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:34:06.803877 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802814    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:34:06.803877 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802816    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:34:06.803877 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802821    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:34:06.803877 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802824    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:34:06.803877 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802827    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:34:06.803877 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802829    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:34:06.803877 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802832    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:34:06.804374 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802835    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:34:06.804374 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802837    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:34:06.804374 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802840    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:34:06.804374 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802843    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:34:06.804374 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802846    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:34:06.804374 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802848    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:34:06.804374 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802851    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:34:06.804374 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802854    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:34:06.804374 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802859    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:34:06.804374 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802862    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:34:06.804374 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802865    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:34:06.804374 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802868    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:34:06.804374 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802871    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:34:06.804374 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802874    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:34:06.804374 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802878    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:34:06.804374 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802881    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:34:06.804374 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802883    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:34:06.804374 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802886    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:34:06.804374 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802889    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:34:06.804374 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802891    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:34:06.804875 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802897    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:34:06.804875 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.802899    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:34:06.804875 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803506    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:34:06.804875 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803512    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:34:06.804875 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803515    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:34:06.804875 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803517    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:34:06.804875 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803521    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:34:06.804875 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803523    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:34:06.804875 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803526    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:34:06.804875 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803529    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:34:06.804875 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803534    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:34:06.804875 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803537    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:34:06.804875 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803540    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:34:06.804875 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803542    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:34:06.804875 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803545    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:34:06.804875 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803548    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:34:06.804875 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803551    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:34:06.804875 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803553    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:34:06.804875 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803557    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:34:06.804875 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803559    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:34:06.805384 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803562    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:34:06.805384 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803564    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:34:06.805384 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803569    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:34:06.805384 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803572    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:34:06.805384 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803585    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:34:06.805384 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803588    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:34:06.805384 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803591    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:34:06.805384 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803594    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:34:06.805384 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803597    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:34:06.805384 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803600    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:34:06.805384 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803602    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:34:06.805384 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803605    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:34:06.805384 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803607    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:34:06.805384 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803610    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:34:06.805384 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803613    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:34:06.805384 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803618    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:34:06.805384 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803620    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:34:06.805384 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803623    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:34:06.805384 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803626    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:34:06.805384 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803629    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:34:06.805931 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803631    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:34:06.805931 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803634    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:34:06.805931 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803637    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:34:06.805931 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803642    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:34:06.805931 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803646    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:34:06.805931 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803649    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:34:06.805931 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803654    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:34:06.805931 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803657    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:34:06.805931 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803660    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:34:06.805931 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803663    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:34:06.805931 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803665    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:34:06.805931 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803668    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:34:06.805931 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803670    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:34:06.805931 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803673    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:34:06.805931 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803677    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:34:06.805931 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803681    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:34:06.805931 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803684    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:34:06.805931 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803687    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:34:06.805931 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803690    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:34:06.806401 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803695    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:34:06.806401 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803699    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:34:06.806401 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803701    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:34:06.806401 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803704    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:34:06.806401 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803707    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:34:06.806401 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803709    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:34:06.806401 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803712    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:34:06.806401 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803714    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:34:06.806401 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803717    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:34:06.806401 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803720    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:34:06.806401 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803722    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:34:06.806401 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803726    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:34:06.806401 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803731    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:34:06.806401 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803733    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:34:06.806401 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803736    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:34:06.806401 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803740    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:34:06.806401 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803742    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:34:06.806401 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803745    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:34:06.806401 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803748    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:34:06.806401 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803750    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:34:06.806894 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803773    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:34:06.806894 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803777    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:34:06.806894 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803780    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:34:06.806894 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803783    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:34:06.806894 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803786    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:34:06.806894 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803791    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:34:06.806894 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803794    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:34:06.806894 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803798    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:34:06.806894 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.803800    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:34:06.806894 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805878    2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 16:34:06.806894 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805902    2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 16:34:06.806894 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805911    2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 16:34:06.806894 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805931    2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 16:34:06.806894 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805936    2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 16:34:06.806894 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805939    2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 16:34:06.806894 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805944    2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 16:34:06.806894 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805952    2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 16:34:06.806894 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805955    2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 16:34:06.806894 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805959    2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 16:34:06.806894 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805962    2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 16:34:06.806894 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805966    2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805969    2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805972    2572 flags.go:64] FLAG: --cgroup-root=""
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805976    2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805979    2572 flags.go:64] FLAG: --client-ca-file=""
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805982    2572 flags.go:64] FLAG: --cloud-config=""
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805985    2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805988    2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805992    2572 flags.go:64] FLAG: --cluster-domain=""
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805995    2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.805999    2572 flags.go:64] FLAG: --config-dir=""
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806001    2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806005    2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806009    2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806011    2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806015    2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806018    2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806021    2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806024    2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806027    2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806031    2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806034    2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806039    2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806042 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806045 2572 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 23 16:34:06.807450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806048 2572 flags.go:64] FLAG: --enable-load-reader="false" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806051 2572 flags.go:64] FLAG: --enable-server="true" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806054 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806059 2572 flags.go:64] FLAG: --event-burst="100" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806062 2572 flags.go:64] FLAG: --event-qps="50" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806065 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806068 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806071 2572 flags.go:64] FLAG: --eviction-hard="" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806075 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806078 2572 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806081 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806084 2572 flags.go:64] FLAG: --eviction-soft="" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806087 2572 
flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806090 2572 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806093 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806096 2572 flags.go:64] FLAG: --experimental-mounter-path="" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806099 2572 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806102 2572 flags.go:64] FLAG: --fail-swap-on="true" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806104 2572 flags.go:64] FLAG: --feature-gates="" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806108 2572 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806112 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806115 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806118 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806122 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806125 2572 flags.go:64] FLAG: --help="false" Apr 23 16:34:06.808084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806128 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-133-231.ec2.internal" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806131 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 16:34:06.808706 
ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806134 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806137 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806142 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806146 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806149 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806152 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806155 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806160 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806163 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806167 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806169 2572 flags.go:64] FLAG: --kube-reserved="" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806172 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806175 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806179 2572 flags.go:64] 
FLAG: --kubelet-cgroups="" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806181 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806184 2572 flags.go:64] FLAG: --lock-file="" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806187 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806190 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806193 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806199 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806201 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806204 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 16:34:06.808706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806207 2572 flags.go:64] FLAG: --logging-format="text" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806210 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806213 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806216 2572 flags.go:64] FLAG: --manifest-url="" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806219 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806225 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 16:34:06.809304 ip-10-0-133-231 
kubenswrapper[2572]: I0423 16:34:06.806228 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806231 2572 flags.go:64] FLAG: --max-pods="110" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806234 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806238 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806240 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806244 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806248 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806251 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806254 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806262 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806265 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806267 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806272 2572 flags.go:64] FLAG: --pod-cidr="" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806275 2572 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806281 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806284 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806287 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806290 2572 flags.go:64] FLAG: --port="10250" Apr 23 16:34:06.809304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806293 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806296 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-027af839607c50934" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806300 2572 flags.go:64] FLAG: --qos-reserved="" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806303 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806306 2572 flags.go:64] FLAG: --register-node="true" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806308 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806311 2572 flags.go:64] FLAG: --register-with-taints="" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806315 2572 flags.go:64] FLAG: --registry-burst="10" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806318 2572 flags.go:64] FLAG: --registry-qps="5" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806321 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 23 
16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806324 2572 flags.go:64] FLAG: --reserved-memory="" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806328 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806331 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806334 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806337 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806340 2572 flags.go:64] FLAG: --runonce="false" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806343 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806346 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806349 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806352 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806355 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806358 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806361 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806365 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 
16:34:06.806367 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806370 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 16:34:06.809878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806373 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806376 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806379 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806382 2572 flags.go:64] FLAG: --system-cgroups="" Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806385 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806391 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806394 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806397 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806401 2572 flags.go:64] FLAG: --tls-min-version="" Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806404 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806407 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806410 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806413 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 
16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806416 2572 flags.go:64] FLAG: --v="2" Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806421 2572 flags.go:64] FLAG: --version="false" Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806425 2572 flags.go:64] FLAG: --vmodule="" Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806429 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806432 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806544 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806548 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806551 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806554 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806558 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 16:34:06.810619 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806561 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 16:34:06.811206 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806563 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 16:34:06.811206 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806566 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 16:34:06.811206 ip-10-0-133-231 
kubenswrapper[2572]: W0423 16:34:06.806570 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 16:34:06.811206 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806573 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 16:34:06.811206 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806575 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 16:34:06.811206 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806578 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 16:34:06.811206 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806580 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 16:34:06.811206 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806583 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 16:34:06.811206 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806586 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 16:34:06.811206 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806588 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 16:34:06.811206 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806591 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 16:34:06.811206 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806593 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 16:34:06.811206 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806596 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 16:34:06.811206 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806598 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 16:34:06.811206 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806601 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 
16:34:06.811206 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806604 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 16:34:06.811206 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806606 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 16:34:06.811206 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806609 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 16:34:06.811206 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806612 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 16:34:06.811206 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806614 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 16:34:06.811745 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806616 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 16:34:06.811745 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806619 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 16:34:06.811745 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806622 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 16:34:06.811745 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806625 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 16:34:06.811745 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806627 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 16:34:06.811745 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806630 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 16:34:06.811745 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806632 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 16:34:06.811745 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806635 2572 
feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 16:34:06.811745 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806638 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 16:34:06.811745 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806640 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 16:34:06.811745 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806642 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 16:34:06.811745 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806645 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 16:34:06.811745 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806647 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 16:34:06.811745 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806650 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 16:34:06.811745 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806652 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 16:34:06.811745 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806655 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 16:34:06.811745 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806657 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 16:34:06.811745 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806660 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 16:34:06.811745 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806662 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 16:34:06.812216 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806667 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 23 16:34:06.812216 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806670 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 16:34:06.812216 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806674 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 16:34:06.812216 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806678 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 16:34:06.812216 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806681 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 16:34:06.812216 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806684 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 16:34:06.812216 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806687 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 16:34:06.812216 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806689 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 16:34:06.812216 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806692 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 16:34:06.812216 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806694 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 16:34:06.812216 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806697 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 16:34:06.812216 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806699 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 16:34:06.812216 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806702 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 
16:34:06.812216 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806704 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 16:34:06.812216 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806707 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 16:34:06.812216 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806710 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 23 16:34:06.812216 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806712 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 16:34:06.812216 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806715 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 16:34:06.812216 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806717 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 16:34:06.812730 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806719 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 16:34:06.812730 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806722 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 16:34:06.812730 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806725 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 16:34:06.812730 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806727 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 16:34:06.812730 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806730 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 16:34:06.812730 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806732 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 16:34:06.812730 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806735 2572 
feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:34:06.812730 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806740 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:34:06.812730 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806742 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:34:06.812730 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806745 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:34:06.812730 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806748 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:34:06.812730 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806751 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:34:06.812730 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806753 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:34:06.812730 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806756 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:34:06.812730 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806758 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:34:06.812730 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806761 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:34:06.812730 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806763 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:34:06.812730 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806766 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:34:06.812730 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806769 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:34:06.813227 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806772 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:34:06.813227 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806774 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:34:06.813227 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.806777 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:34:06.813227 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.806782 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 16:34:06.813470 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.813451 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 16:34:06.813503 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.813470 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 16:34:06.813533 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813514 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:34:06.813533 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813519 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:34:06.813533 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813522 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:34:06.813533 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813526 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:34:06.813533 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813529 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:34:06.813533 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813532 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:34:06.813533 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813535 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:34:06.813706 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813538 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:34:06.813706 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813541 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:34:06.813706 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813544 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:34:06.813706 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813547 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:34:06.813706 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813550 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:34:06.813706 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813552 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:34:06.813706 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813555 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:34:06.813706 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813558 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:34:06.813706 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813561 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:34:06.813706 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813564 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:34:06.813706 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813566 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:34:06.813706 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813569 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:34:06.813706 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813572 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:34:06.813706 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813575 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:34:06.813706 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813578 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:34:06.813706 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813580 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:34:06.813706 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813583 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:34:06.813706 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813586 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:34:06.813706 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813588 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:34:06.814186 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813591 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:34:06.814186 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813593 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:34:06.814186 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813596 2572
feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:34:06.814186 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813599 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:34:06.814186 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813601 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:34:06.814186 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813605 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:34:06.814186 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813607 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:34:06.814186 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813610 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:34:06.814186 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813612 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:34:06.814186 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813615 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:34:06.814186 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813617 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:34:06.814186 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813620 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:34:06.814186 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813622 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:34:06.814186 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813624 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:34:06.814186 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813628 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:34:06.814186 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813631 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:34:06.814186 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813634 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:34:06.814186 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813636 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:34:06.814186 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813638 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:34:06.814186 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813641 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:34:06.814673 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813643 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:34:06.814673 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813646 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:34:06.814673 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813649 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:34:06.814673 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813652 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:34:06.814673 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813655 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:34:06.814673 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813657 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:34:06.814673 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813660 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:34:06.814673 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813662 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:34:06.814673 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813665 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:34:06.814673 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813667 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:34:06.814673 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813670 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:34:06.814673 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813672 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:34:06.814673 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813675 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:34:06.814673 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813677 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:34:06.814673 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813680 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:34:06.814673 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813682 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:34:06.814673 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813685 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:34:06.814673 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813689 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:34:06.814673 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813693 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:34:06.814673 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813697 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:34:06.815211 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813700 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:34:06.815211 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813703 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:34:06.815211 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813706 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:34:06.815211 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813710 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:34:06.815211 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813713 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:34:06.815211 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813716 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:34:06.815211 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813719 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:34:06.815211 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813722 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:34:06.815211 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813725 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:34:06.815211 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813728 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:34:06.815211 ip-10-0-133-231
kubenswrapper[2572]: W0423 16:34:06.813730 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:34:06.815211 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813733 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:34:06.815211 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813735 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:34:06.815211 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813738 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:34:06.815211 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813740 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:34:06.815211 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813743 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:34:06.815211 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813746 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:34:06.815211 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813748 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:34:06.815211 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813750 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:34:06.815211 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813753 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:34:06.815713 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.813758 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 16:34:06.815713 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813848 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:34:06.815713 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813852 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:34:06.815713 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813855 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:34:06.815713 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813858 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:34:06.815713 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813861 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:34:06.815713 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813864 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:34:06.815713 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813866 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:34:06.815713 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813869 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:34:06.815713 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813872 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:34:06.815713 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813874 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:34:06.815713 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813877 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:34:06.815713 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813880 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:34:06.815713 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813883 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:34:06.816104 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813886 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:34:06.816104 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813889 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:34:06.816104 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813891 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:34:06.816104 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813894 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:34:06.816104 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813897 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:34:06.816104 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813899 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:34:06.816104 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813902 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:34:06.816104 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813904 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:34:06.816104 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813907 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:34:06.816104 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813909 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:34:06.816104 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813927 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:34:06.816104 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813931 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:34:06.816104 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813937 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:34:06.816104 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813942 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:34:06.816104 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813945 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:34:06.816104 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813948 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:34:06.816104 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813951 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:34:06.816104 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813954 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:34:06.816104 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813957 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:34:06.816104 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813959 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:34:06.816575 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813962 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:34:06.816575 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813965 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:34:06.816575 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813967 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:34:06.816575 ip-10-0-133-231 kubenswrapper[2572]:
W0423 16:34:06.813970 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:34:06.816575 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813972 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:34:06.816575 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813975 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:34:06.816575 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813978 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:34:06.816575 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813980 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:34:06.816575 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813983 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:34:06.816575 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813985 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:34:06.816575 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813988 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:34:06.816575 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813991 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:34:06.816575 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813994 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:34:06.816575 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813996 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:34:06.816575 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.813999 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:34:06.816575 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814001 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:34:06.816575 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814003 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:34:06.816575 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814006 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:34:06.816575 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814009 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:34:06.816575 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814011 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:34:06.817082 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814014 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:34:06.817082 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814016 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:34:06.817082 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814019 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:34:06.817082 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814021 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:34:06.817082 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814024 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:34:06.817082 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814026 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:34:06.817082 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814028 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:34:06.817082 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814031 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:34:06.817082 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814033 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:34:06.817082 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814036 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:34:06.817082 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814038 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:34:06.817082 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814042 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:34:06.817082 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814046 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:34:06.817082 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814048 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:34:06.817082 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814051 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:34:06.817082 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814053 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:34:06.817082 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814056 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:34:06.817082 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814058 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:34:06.817082 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814061 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:34:06.817082 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814063 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:34:06.817627 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814066 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:34:06.817627 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814068 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:34:06.817627 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814071 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:34:06.817627 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814073 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:34:06.817627 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814076 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:34:06.817627 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814078 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:34:06.817627 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814081 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:34:06.817627 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814083 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:34:06.817627 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814086 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:34:06.817627 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814088 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:34:06.817627 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814091 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:34:06.817627 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814093 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:34:06.817627 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:06.814096 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:34:06.817627 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.814101 2572 feature_gate.go:384] feature gates:
{map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 16:34:06.817627 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.814965 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 16:34:06.818020 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.817392 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 16:34:06.818569 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.818558 2572 server.go:1019] "Starting client certificate rotation"
Apr 23 16:34:06.818665 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.818649 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 16:34:06.818705 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.818688 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 16:34:06.846534 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.846517 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 16:34:06.852304 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.852283 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 16:34:06.867272 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.867257 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 23 16:34:06.874592 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.874580 2572 log.go:25] "Validated CRI v1 image API"
Apr 23 16:34:06.878007 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.877991 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 16:34:06.882412 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.882393 2572 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 8fa7e8bc-8416-4e7a-ba4a-e7ea7b0adbaf:/dev/nvme0n1p3 c2ea3c86-5095-41ac-84dc-0cfdaa5ef2df:/dev/nvme0n1p4]
Apr 23 16:34:06.882473 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.882412 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 16:34:06.883611 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.883478 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 16:34:06.888283 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.888173 2572 manager.go:217] Machine: {Timestamp:2026-04-23 16:34:06.88606725 +0000 UTC m=+0.458580162 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3109521 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec26dea95e1a78fb6613e93dae7182b3 SystemUUID:ec26dea9-5e1a-78fb-6613-e93dae7182b3 BootID:183b4d43-5b49-4849-8b14-33dd87d52f01 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c2:3d:a8:66:bf Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c2:3d:a8:66:bf Speed:0 Mtu:9001} {Name:ovs-system MacAddress:16:0a:3a:77:c1:fb Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 16:34:06.888283 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.888275 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 16:34:06.888452 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.888384 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 16:34:06.889629 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.889603 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 16:34:06.889794 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.889631 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-231.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 23 16:34:06.889877 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.889808 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 23 16:34:06.889877 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.889822 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 23 16:34:06.889877 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.889841 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 16:34:06.890767 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.890754 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 16:34:06.892614 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.892601 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 23 16:34:06.892736 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.892725 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 16:34:06.895530 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.895519 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 23 16:34:06.895586 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.895536 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 23 16:34:06.895586 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.895552 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 16:34:06.895586 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.895566 2572 kubelet.go:397] "Adding apiserver pod source" Apr 23 16:34:06.895586 
ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.895579 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 23 16:34:06.896804 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.896790 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 16:34:06.896867 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.896813 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 16:34:06.900094 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.900068 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 16:34:06.901856 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.901844 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 16:34:06.904305 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.904294 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 16:34:06.904377 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.904312 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 16:34:06.904377 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.904319 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 16:34:06.904377 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.904324 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 16:34:06.904377 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.904330 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 16:34:06.904377 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.904336 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 16:34:06.904377 ip-10-0-133-231 
kubenswrapper[2572]: I0423 16:34:06.904342 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 16:34:06.904377 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.904347 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 16:34:06.904377 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.904353 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 16:34:06.904377 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.904366 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 16:34:06.904377 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.904378 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 16:34:06.904653 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.904387 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 16:34:06.905555 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.905546 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 16:34:06.905593 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.905555 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 16:34:06.908646 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:06.908629 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 16:34:06.908784 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.908769 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-231.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 16:34:06.908819 
ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:06.908782 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-231.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 16:34:06.909112 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.909101 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 16:34:06.909144 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.909136 2572 server.go:1295] "Started kubelet" Apr 23 16:34:06.909303 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.909247 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 16:34:06.909385 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.909283 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 16:34:06.909385 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.909358 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 16:34:06.909950 ip-10-0-133-231 systemd[1]: Started Kubernetes Kubelet. 
Apr 23 16:34:06.910844 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.910828 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 16:34:06.912112 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.912097 2572 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 16:34:06.916045 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:06.914932 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-231.ec2.internal.18a90991215e5f35 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-231.ec2.internal,UID:ip-10-0-133-231.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-231.ec2.internal,},FirstTimestamp:2026-04-23 16:34:06.909112117 +0000 UTC m=+0.481625029,LastTimestamp:2026-04-23 16:34:06.909112117 +0000 UTC m=+0.481625029,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-231.ec2.internal,}"
Apr 23 16:34:06.918427 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:06.918407 2572 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 16:34:06.919776 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.919755 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 16:34:06.920398 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.920382 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 16:34:06.922583 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.922461 2572 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 16:34:06.922673 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.922587 2572 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 16:34:06.922673 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.922539 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 16:34:06.922784 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.922772 2572 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 16:34:06.922784 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.922784 2572 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 16:34:06.922859 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:06.922781 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 23 16:34:06.924411 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.924394 2572 factory.go:153] Registering CRI-O factory
Apr 23 16:34:06.924490 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.924480 2572 factory.go:223] Registration of the crio container factory successfully
Apr 23 16:34:06.924579 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.924559 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 16:34:06.924579 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.924575 2572 factory.go:55] Registering systemd factory
Apr 23 16:34:06.924710 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.924585 2572 factory.go:223] Registration of the systemd container factory successfully
Apr 23 16:34:06.924710 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.924610 2572 factory.go:103] Registering Raw factory
Apr 23 16:34:06.924710 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.924628 2572 manager.go:1196] Started watching for new ooms in manager
Apr 23 16:34:06.925383 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.925368 2572 manager.go:319] Starting recovery of all containers
Apr 23 16:34:06.935616 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.935603 2572 manager.go:324] Recovery completed
Apr 23 16:34:06.936066 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:06.936047 2572 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-231.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 23 16:34:06.936066 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:06.936049 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 16:34:06.939372 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.939360 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:34:06.941897 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.941870 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:34:06.941999 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.941912 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:34:06.941999 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.941941 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:34:06.942501 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.942488 2572 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 16:34:06.942501 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.942501 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 16:34:06.942585 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.942544 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 16:34:06.946421 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.946409 2572 policy_none.go:49] "None policy: Start"
Apr 23 16:34:06.946462 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.946425 2572 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 16:34:06.946462 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.946435 2572 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 16:34:06.955223 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.955206 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-d2wt9"
Apr 23 16:34:06.955665 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:06.955599 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-231.ec2.internal.18a909912352a001 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-231.ec2.internal,UID:ip-10-0-133-231.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-133-231.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-133-231.ec2.internal,},FirstTimestamp:2026-04-23 16:34:06.941896705 +0000 UTC m=+0.514409620,LastTimestamp:2026-04-23 16:34:06.941896705 +0000 UTC m=+0.514409620,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-231.ec2.internal,}"
Apr 23 16:34:06.967024 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.966997 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-d2wt9"
Apr 23 16:34:06.967099 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:06.966939 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-231.ec2.internal.18a9099123532d5a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-231.ec2.internal,UID:ip-10-0-133-231.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-133-231.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-133-231.ec2.internal,},FirstTimestamp:2026-04-23 16:34:06.94193289 +0000 UTC m=+0.514445803,LastTimestamp:2026-04-23 16:34:06.94193289 +0000 UTC m=+0.514445803,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-231.ec2.internal,}"
Apr 23 16:34:06.987435 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.987419 2572 manager.go:341] "Starting Device Plugin manager"
Apr 23 16:34:07.005055 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:06.987454 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 16:34:07.005055 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.987463 2572 server.go:85] "Starting device plugin registration server"
Apr 23 16:34:07.005055 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.987697 2572 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 16:34:07.005055 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.987709 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 16:34:07.005055 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.987792 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 16:34:07.005055 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.987863 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 16:34:07.005055 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:06.987869 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 16:34:07.005055 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:06.988494 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 16:34:07.005055 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:06.988547 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 23 16:34:07.024338 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.024315 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 16:34:07.025503 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.025487 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 16:34:07.025584 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.025516 2572 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 16:34:07.025584 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.025538 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 16:34:07.025584 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.025549 2572 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 16:34:07.025714 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:07.025588 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 16:34:07.028706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.028681 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:34:07.087904 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.087838 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:34:07.088788 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.088773 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:34:07.088861 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.088800 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:34:07.088861 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.088812 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:34:07.088861 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.088838 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-231.ec2.internal"
Apr 23 16:34:07.098040 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.098023 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-231.ec2.internal"
Apr 23 16:34:07.098084 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:07.098045 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-231.ec2.internal\": node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 23 16:34:07.114143 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:07.114119 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 23 16:34:07.126345 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.126313 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal"]
Apr 23 16:34:07.126408 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.126399 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:34:07.131907 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.131892 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:34:07.131993 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.131930 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:34:07.131993 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.131942 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:34:07.134302 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.134291 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:34:07.134444 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.134430 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal"
Apr 23 16:34:07.134492 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.134458 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:34:07.135142 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.135124 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:34:07.135229 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.135161 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:34:07.135229 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.135177 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:34:07.135303 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.135233 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:34:07.135303 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.135261 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:34:07.135303 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.135271 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:34:07.137496 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.137481 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal"
Apr 23 16:34:07.137568 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.137505 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:34:07.138121 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.138106 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:34:07.138196 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.138135 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:34:07.138196 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.138151 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:34:07.160516 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:07.160487 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-231.ec2.internal\" not found" node="ip-10-0-133-231.ec2.internal"
Apr 23 16:34:07.164842 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:07.164826 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-231.ec2.internal\" not found" node="ip-10-0-133-231.ec2.internal"
Apr 23 16:34:07.214881 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:07.214863 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 23 16:34:07.223445 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.223425 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/23282ac0c4062a2415d09037c9c74b84-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal\" (UID: \"23282ac0c4062a2415d09037c9c74b84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal"
Apr 23 16:34:07.223504 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.223461 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23282ac0c4062a2415d09037c9c74b84-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal\" (UID: \"23282ac0c4062a2415d09037c9c74b84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal"
Apr 23 16:34:07.223504 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.223490 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1eb73e8eaa1b833503296b19a264c17c-config\") pod \"kube-apiserver-proxy-ip-10-0-133-231.ec2.internal\" (UID: \"1eb73e8eaa1b833503296b19a264c17c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal"
Apr 23 16:34:07.315193 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:07.315169 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 23 16:34:07.324628 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.324610 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/23282ac0c4062a2415d09037c9c74b84-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal\" (UID: \"23282ac0c4062a2415d09037c9c74b84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal"
Apr 23 16:34:07.324696 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.324641 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23282ac0c4062a2415d09037c9c74b84-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal\" (UID: \"23282ac0c4062a2415d09037c9c74b84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal"
Apr 23 16:34:07.324696 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.324674 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1eb73e8eaa1b833503296b19a264c17c-config\") pod \"kube-apiserver-proxy-ip-10-0-133-231.ec2.internal\" (UID: \"1eb73e8eaa1b833503296b19a264c17c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal"
Apr 23 16:34:07.324793 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.324713 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1eb73e8eaa1b833503296b19a264c17c-config\") pod \"kube-apiserver-proxy-ip-10-0-133-231.ec2.internal\" (UID: \"1eb73e8eaa1b833503296b19a264c17c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal"
Apr 23 16:34:07.324793 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.324713 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/23282ac0c4062a2415d09037c9c74b84-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal\" (UID: \"23282ac0c4062a2415d09037c9c74b84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal"
Apr 23 16:34:07.324865 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.324785 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23282ac0c4062a2415d09037c9c74b84-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal\" (UID: \"23282ac0c4062a2415d09037c9c74b84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal"
Apr 23 16:34:07.416022 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:07.415989 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 23 16:34:07.462591 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.462564 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal"
Apr 23 16:34:07.467335 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.467303 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal"
Apr 23 16:34:07.516878 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:07.516853 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 23 16:34:07.617510 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:07.617478 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 23 16:34:07.718094 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:07.718034 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 23 16:34:07.817681 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.817654 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 16:34:07.818354 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.817801 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 16:34:07.818748 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:07.818726 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 23 16:34:07.919232 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:07.919198 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found"
Apr 23 16:34:07.919867 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.919852 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 16:34:07.923082 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.923062 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:34:07.933347 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.933322 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 16:34:07.962117 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.962096 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-zpzcs"
Apr 23 16:34:07.971251 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.971197 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 16:29:06 +0000 UTC" deadline="2027-12-14 00:44:33.212844725 +0000 UTC"
Apr 23 16:34:07.971251 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.971222 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14384h10m25.241626097s"
Apr 23 16:34:07.972738 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:07.972718 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving"
csr="csr-zpzcs" Apr 23 16:34:08.020327 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:08.020304 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found" Apr 23 16:34:08.044469 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:08.044440 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1eb73e8eaa1b833503296b19a264c17c.slice/crio-8f0c4649afee8cea4d2e2e04f41d8e19b4afa47e78eed9509822b01ffdf7f161 WatchSource:0}: Error finding container 8f0c4649afee8cea4d2e2e04f41d8e19b4afa47e78eed9509822b01ffdf7f161: Status 404 returned error can't find the container with id 8f0c4649afee8cea4d2e2e04f41d8e19b4afa47e78eed9509822b01ffdf7f161 Apr 23 16:34:08.045018 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:08.044994 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23282ac0c4062a2415d09037c9c74b84.slice/crio-cf8de62813c84ce2c0818d3070ab675f627386b03cba0cd91f45a523facef85b WatchSource:0}: Error finding container cf8de62813c84ce2c0818d3070ab675f627386b03cba0cd91f45a523facef85b: Status 404 returned error can't find the container with id cf8de62813c84ce2c0818d3070ab675f627386b03cba0cd91f45a523facef85b Apr 23 16:34:08.048538 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.048524 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:34:08.121452 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:08.121419 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found" Apr 23 16:34:08.137350 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.137331 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:34:08.221672 ip-10-0-133-231 kubenswrapper[2572]: E0423 
16:34:08.221604 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found" Apr 23 16:34:08.322126 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:08.322097 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-231.ec2.internal\" not found" Apr 23 16:34:08.400234 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.400209 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:34:08.421333 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.421313 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal" Apr 23 16:34:08.441505 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.441481 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 16:34:08.442534 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.442509 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal" Apr 23 16:34:08.451248 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.451227 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 16:34:08.897246 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.896980 2572 apiserver.go:52] "Watching apiserver" Apr 23 16:34:08.907325 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.907288 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 16:34:08.907742 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.907715 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-network-operator/iptables-alerter-cx8tc","openshift-ovn-kubernetes/ovnkube-node-95t7c","kube-system/konnectivity-agent-jn7xj","kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm","openshift-cluster-node-tuning-operator/tuned-sjmb9","openshift-dns/node-resolver-mqskj","openshift-image-registry/node-ca-dp4mb","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal","openshift-multus/multus-additional-cni-plugins-r652k","openshift-multus/multus-wjwgw","openshift-multus/network-metrics-daemon-glcj7","openshift-network-diagnostics/network-check-target-d7t2c"] Apr 23 16:34:08.910708 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.910690 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-r652k" Apr 23 16:34:08.913009 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.912989 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:08.913566 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.913539 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 16:34:08.914049 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.913795 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-qlnsr\"" Apr 23 16:34:08.914049 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.913841 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 16:34:08.914049 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.913862 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 16:34:08.914049 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.913900 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 16:34:08.914288 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.914184 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 16:34:08.917890 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.917224 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 16:34:08.917890 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.917268 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 16:34:08.917890 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.917583 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 16:34:08.917890 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.917696 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 16:34:08.918151 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.917898 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 16:34:08.918151 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.918036 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 16:34:08.918541 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.918313 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-nhpz7\"" Apr 23 16:34:08.919396 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.919377 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jn7xj" Apr 23 16:34:08.922167 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.921736 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm" Apr 23 16:34:08.923991 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.923960 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 16:34:08.924184 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.924166 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:08.924682 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.924650 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 16:34:08.924798 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.924763 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 16:34:08.924955 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.924934 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7b98n\"" Apr 23 16:34:08.925301 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.925282 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 16:34:08.925385 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.925330 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 16:34:08.925613 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.925594 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-94hm9\"" Apr 23 16:34:08.926646 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.926287 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mqskj" Apr 23 16:34:08.926646 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.926418 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-dp4mb" Apr 23 16:34:08.926786 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.926672 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-r7zt2\"" Apr 23 16:34:08.926786 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.926712 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 16:34:08.926975 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.926956 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 16:34:08.928510 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.928454 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-tnk44\"" Apr 23 16:34:08.928716 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.928681 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 16:34:08.928779 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.928723 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 16:34:08.928982 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.928948 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 16:34:08.929156 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.929138 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 16:34:08.929156 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.929154 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 16:34:08.929265 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.929194 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-8jkvt\"" Apr 23 16:34:08.931612 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.931331 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-cx8tc" Apr 23 16:34:08.931833 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.931814 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-host-slash\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:08.931932 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.931848 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-etc-systemd\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:08.931932 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.931877 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-host-run-netns\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:08.932049 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.931930 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-var-lib-openvswitch\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:08.932049 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.931986 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ac2dc8cd-1e1d-484d-a299-65eb94658e63-socket-dir\") pod \"aws-ebs-csi-driver-node-9csvm\" (UID: \"ac2dc8cd-1e1d-484d-a299-65eb94658e63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm" Apr 23 16:34:08.932152 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932054 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-etc-kubernetes\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:08.932152 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932087 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/613078ff-a7b4-43be-9362-ff4e9be86af1-agent-certs\") pod \"konnectivity-agent-jn7xj\" (UID: \"613078ff-a7b4-43be-9362-ff4e9be86af1\") " pod="kube-system/konnectivity-agent-jn7xj" Apr 23 16:34:08.932152 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932111 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ac2dc8cd-1e1d-484d-a299-65eb94658e63-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9csvm\" (UID: \"ac2dc8cd-1e1d-484d-a299-65eb94658e63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm" Apr 23 16:34:08.932152 ip-10-0-133-231 kubenswrapper[2572]: I0423 
16:34:08.932136 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-run\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:08.932343 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932160 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-etc-sysconfig\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:08.932343 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932183 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdmpl\" (UniqueName: \"kubernetes.io/projected/821ae57d-81ac-4242-a0c9-51cdf1716096-kube-api-access-jdmpl\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:08.932343 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932215 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-systemd-units\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:08.932343 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932242 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-host-cni-bin\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:08.932343 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932264 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-etc-sysctl-conf\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:08.932343 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932286 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-etc-openvswitch\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:08.932343 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932312 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-host-cni-netd\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:08.932343 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932337 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxmlv\" (UniqueName: \"kubernetes.io/projected/4b2c8879-054c-4712-b5f0-7d3038cf3e84-kube-api-access-wxmlv\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:08.932733 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932362 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24r5n\" (UniqueName: 
\"kubernetes.io/projected/ac2dc8cd-1e1d-484d-a299-65eb94658e63-kube-api-access-24r5n\") pod \"aws-ebs-csi-driver-node-9csvm\" (UID: \"ac2dc8cd-1e1d-484d-a299-65eb94658e63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm" Apr 23 16:34:08.932733 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932385 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f24243b7-5732-41e7-a97d-ff3ef6a751d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k" Apr 23 16:34:08.932733 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932408 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-host-run-ovn-kubernetes\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:08.932733 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932466 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4b2c8879-054c-4712-b5f0-7d3038cf3e84-ovnkube-script-lib\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:08.932733 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932489 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-run-ovn\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:08.932733 
ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932518 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f24243b7-5732-41e7-a97d-ff3ef6a751d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k" Apr 23 16:34:08.932733 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932542 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b2c8879-054c-4712-b5f0-7d3038cf3e84-ovn-node-metrics-cert\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:08.932733 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932567 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:08.932733 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932592 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/613078ff-a7b4-43be-9362-ff4e9be86af1-konnectivity-ca\") pod \"konnectivity-agent-jn7xj\" (UID: \"613078ff-a7b4-43be-9362-ff4e9be86af1\") " pod="kube-system/konnectivity-agent-jn7xj" Apr 23 16:34:08.932733 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932629 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/ac2dc8cd-1e1d-484d-a299-65eb94658e63-registration-dir\") pod \"aws-ebs-csi-driver-node-9csvm\" (UID: \"ac2dc8cd-1e1d-484d-a299-65eb94658e63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm"
Apr 23 16:34:08.932733 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932677 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ac2dc8cd-1e1d-484d-a299-65eb94658e63-etc-selinux\") pod \"aws-ebs-csi-driver-node-9csvm\" (UID: \"ac2dc8cd-1e1d-484d-a299-65eb94658e63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm"
Apr 23 16:34:08.932733 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932710 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7789b2de-75cc-4057-8981-8850b48ac765-tmp-dir\") pod \"node-resolver-mqskj\" (UID: \"7789b2de-75cc-4057-8981-8850b48ac765\") " pod="openshift-dns/node-resolver-mqskj"
Apr 23 16:34:08.933303 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932740 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ac2dc8cd-1e1d-484d-a299-65eb94658e63-device-dir\") pod \"aws-ebs-csi-driver-node-9csvm\" (UID: \"ac2dc8cd-1e1d-484d-a299-65eb94658e63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm"
Apr 23 16:34:08.933303 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932771 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-lib-modules\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9"
Apr 23 16:34:08.933303 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932802 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7789b2de-75cc-4057-8981-8850b48ac765-hosts-file\") pod \"node-resolver-mqskj\" (UID: \"7789b2de-75cc-4057-8981-8850b48ac765\") " pod="openshift-dns/node-resolver-mqskj"
Apr 23 16:34:08.933303 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932829 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f24243b7-5732-41e7-a97d-ff3ef6a751d0-cnibin\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k"
Apr 23 16:34:08.933303 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932859 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-host-kubelet\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:08.933303 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.932883 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-sys\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9"
Apr 23 16:34:08.933303 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933036 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvvtz\" (UniqueName: \"kubernetes.io/projected/7789b2de-75cc-4057-8981-8850b48ac765-kube-api-access-mvvtz\") pod \"node-resolver-mqskj\" (UID: \"7789b2de-75cc-4057-8981-8850b48ac765\") " pod="openshift-dns/node-resolver-mqskj"
Apr 23 16:34:08.933303 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933071 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b2c8879-054c-4712-b5f0-7d3038cf3e84-env-overrides\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:08.933303 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933101 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/821ae57d-81ac-4242-a0c9-51cdf1716096-tmp\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9"
Apr 23 16:34:08.933303 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933126 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b2c8879-054c-4712-b5f0-7d3038cf3e84-ovnkube-config\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:08.933303 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933154 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f24243b7-5732-41e7-a97d-ff3ef6a751d0-system-cni-dir\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k"
Apr 23 16:34:08.933303 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933181 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f24243b7-5732-41e7-a97d-ff3ef6a751d0-os-release\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k"
Apr 23 16:34:08.933303 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933207 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f24243b7-5732-41e7-a97d-ff3ef6a751d0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k"
Apr 23 16:34:08.933303 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933235 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7qss\" (UniqueName: \"kubernetes.io/projected/f24243b7-5732-41e7-a97d-ff3ef6a751d0-kube-api-access-f7qss\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k"
Apr 23 16:34:08.933303 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933289 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-run-openvswitch\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:08.933303 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933314 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f24243b7-5732-41e7-a97d-ff3ef6a751d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k"
Apr 23 16:34:08.934108 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933339 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-node-log\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:08.934108 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933366 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-log-socket\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:08.934108 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933390 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-var-lib-kubelet\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9"
Apr 23 16:34:08.934108 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933429 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ac2dc8cd-1e1d-484d-a299-65eb94658e63-sys-fs\") pod \"aws-ebs-csi-driver-node-9csvm\" (UID: \"ac2dc8cd-1e1d-484d-a299-65eb94658e63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm"
Apr 23 16:34:08.934108 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933466 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-etc-modprobe-d\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9"
Apr 23 16:34:08.934108 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933492 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/821ae57d-81ac-4242-a0c9-51cdf1716096-etc-tuned\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9"
Apr 23 16:34:08.934108 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933516 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-run-systemd\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:08.934108 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933584 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-etc-sysctl-d\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9"
Apr 23 16:34:08.934108 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933606 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:08.934108 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933605 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-host\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9"
Apr 23 16:34:08.934108 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933765 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 16:34:08.934108 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.933805 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-glcj7"
Apr 23 16:34:08.934108 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:08.933884 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-glcj7" podUID="eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef"
Apr 23 16:34:08.934108 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.934004 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:34:08.934108 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.934108 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 16:34:08.934791 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.934243 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-9zqk6\""
Apr 23 16:34:08.936174 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.936099 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7t2c"
Apr 23 16:34:08.936174 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:08.936162 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7t2c" podUID="1fcbd9bc-88ba-48d1-978b-f8e2585ab84c"
Apr 23 16:34:08.936599 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.936585 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 16:34:08.937416 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.937397 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-tx5dc\""
Apr 23 16:34:08.973373 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.973344 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:29:07 +0000 UTC" deadline="2027-11-08 04:22:52.417920774 +0000 UTC"
Apr 23 16:34:08.973373 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:08.973372 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13523h48m43.44455224s"
Apr 23 16:34:09.024257 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.024231 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 16:34:09.029804 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.029755 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal" event={"ID":"23282ac0c4062a2415d09037c9c74b84","Type":"ContainerStarted","Data":"cf8de62813c84ce2c0818d3070ab675f627386b03cba0cd91f45a523facef85b"}
Apr 23 16:34:09.031045 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.031018 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal" event={"ID":"1eb73e8eaa1b833503296b19a264c17c","Type":"ContainerStarted","Data":"8f0c4649afee8cea4d2e2e04f41d8e19b4afa47e78eed9509822b01ffdf7f161"}
Apr 23 16:34:09.033989 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.033968 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q28c8\" (UniqueName: \"kubernetes.io/projected/10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa-kube-api-access-q28c8\") pod \"node-ca-dp4mb\" (UID: \"10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa\") " pod="openshift-image-registry/node-ca-dp4mb"
Apr 23 16:34:09.034084 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034008 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f24243b7-5732-41e7-a97d-ff3ef6a751d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k"
Apr 23 16:34:09.034148 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034109 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b2c8879-054c-4712-b5f0-7d3038cf3e84-ovn-node-metrics-cert\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:09.034148 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034138 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf96e64c-13d1-4533-b0fb-a69566795f63-cni-binary-copy\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.034247 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034169 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:09.034247 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034168 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f24243b7-5732-41e7-a97d-ff3ef6a751d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k"
Apr 23 16:34:09.034247 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034211 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/613078ff-a7b4-43be-9362-ff4e9be86af1-konnectivity-ca\") pod \"konnectivity-agent-jn7xj\" (UID: \"613078ff-a7b4-43be-9362-ff4e9be86af1\") " pod="kube-system/konnectivity-agent-jn7xj"
Apr 23 16:34:09.034247 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034231 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:09.034422 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034236 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ac2dc8cd-1e1d-484d-a299-65eb94658e63-registration-dir\") pod \"aws-ebs-csi-driver-node-9csvm\" (UID: \"ac2dc8cd-1e1d-484d-a299-65eb94658e63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm"
Apr 23 16:34:09.034422 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034288 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ac2dc8cd-1e1d-484d-a299-65eb94658e63-registration-dir\") pod \"aws-ebs-csi-driver-node-9csvm\" (UID: \"ac2dc8cd-1e1d-484d-a299-65eb94658e63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm"
Apr 23 16:34:09.034422 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034308 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ac2dc8cd-1e1d-484d-a299-65eb94658e63-etc-selinux\") pod \"aws-ebs-csi-driver-node-9csvm\" (UID: \"ac2dc8cd-1e1d-484d-a299-65eb94658e63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm"
Apr 23 16:34:09.034422 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034339 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7789b2de-75cc-4057-8981-8850b48ac765-tmp-dir\") pod \"node-resolver-mqskj\" (UID: \"7789b2de-75cc-4057-8981-8850b48ac765\") " pod="openshift-dns/node-resolver-mqskj"
Apr 23 16:34:09.034422 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034370 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0cb82eee-8d58-46f1-8148-5a83f7d6a3a1-host-slash\") pod \"iptables-alerter-cx8tc\" (UID: \"0cb82eee-8d58-46f1-8148-5a83f7d6a3a1\") " pod="openshift-network-operator/iptables-alerter-cx8tc"
Apr 23 16:34:09.034422 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034406 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ac2dc8cd-1e1d-484d-a299-65eb94658e63-device-dir\") pod \"aws-ebs-csi-driver-node-9csvm\" (UID: \"ac2dc8cd-1e1d-484d-a299-65eb94658e63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm"
Apr 23 16:34:09.034422 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034409 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ac2dc8cd-1e1d-484d-a299-65eb94658e63-etc-selinux\") pod \"aws-ebs-csi-driver-node-9csvm\" (UID: \"ac2dc8cd-1e1d-484d-a299-65eb94658e63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm"
Apr 23 16:34:09.034648 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034432 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-lib-modules\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9"
Apr 23 16:34:09.034648 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034461 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7789b2de-75cc-4057-8981-8850b48ac765-hosts-file\") pod \"node-resolver-mqskj\" (UID: \"7789b2de-75cc-4057-8981-8850b48ac765\") " pod="openshift-dns/node-resolver-mqskj"
Apr 23 16:34:09.034648 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034487 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f24243b7-5732-41e7-a97d-ff3ef6a751d0-cnibin\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k"
Apr 23 16:34:09.034648 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034513 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-host-kubelet\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:09.034648 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034570 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-sys\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9"
Apr 23 16:34:09.034648 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034597 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvvtz\" (UniqueName: \"kubernetes.io/projected/7789b2de-75cc-4057-8981-8850b48ac765-kube-api-access-mvvtz\") pod \"node-resolver-mqskj\" (UID: \"7789b2de-75cc-4057-8981-8850b48ac765\") " pod="openshift-dns/node-resolver-mqskj"
Apr 23 16:34:09.034648 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034600 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 23 16:34:09.034648 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034624 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-cnibin\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.035094 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034649 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-multus-conf-dir\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.035094 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034676 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bf96e64c-13d1-4533-b0fb-a69566795f63-multus-daemon-config\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.035094 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034684 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-lib-modules\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9"
Apr 23 16:34:09.035094 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034696 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7789b2de-75cc-4057-8981-8850b48ac765-tmp-dir\") pod \"node-resolver-mqskj\" (UID: \"7789b2de-75cc-4057-8981-8850b48ac765\") " pod="openshift-dns/node-resolver-mqskj"
Apr 23 16:34:09.035094 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034702 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b2c8879-054c-4712-b5f0-7d3038cf3e84-env-overrides\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:09.035094 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034730 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-multus-cni-dir\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.035094 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034756 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ac2dc8cd-1e1d-484d-a299-65eb94658e63-device-dir\") pod \"aws-ebs-csi-driver-node-9csvm\" (UID: \"ac2dc8cd-1e1d-484d-a299-65eb94658e63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm"
Apr 23 16:34:09.035094 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034770 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-host-run-multus-certs\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.035094 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034800 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/821ae57d-81ac-4242-a0c9-51cdf1716096-tmp\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9"
Apr 23 16:34:09.035094 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034804 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7789b2de-75cc-4057-8981-8850b48ac765-hosts-file\") pod \"node-resolver-mqskj\" (UID: \"7789b2de-75cc-4057-8981-8850b48ac765\") " pod="openshift-dns/node-resolver-mqskj"
Apr 23 16:34:09.035094 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034810 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/613078ff-a7b4-43be-9362-ff4e9be86af1-konnectivity-ca\") pod \"konnectivity-agent-jn7xj\" (UID: \"613078ff-a7b4-43be-9362-ff4e9be86af1\") " pod="kube-system/konnectivity-agent-jn7xj"
Apr 23 16:34:09.035094 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034824 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b2c8879-054c-4712-b5f0-7d3038cf3e84-ovnkube-config\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:09.035094 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.034868 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f24243b7-5732-41e7-a97d-ff3ef6a751d0-system-cni-dir\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k"
Apr 23 16:34:09.035094 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035031 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f24243b7-5732-41e7-a97d-ff3ef6a751d0-os-release\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k"
Apr 23 16:34:09.035094 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035087 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f24243b7-5732-41e7-a97d-ff3ef6a751d0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k"
Apr 23 16:34:09.035769 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035110 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f24243b7-5732-41e7-a97d-ff3ef6a751d0-cnibin\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k"
Apr 23 16:34:09.035769 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035122 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7qss\" (UniqueName: \"kubernetes.io/projected/f24243b7-5732-41e7-a97d-ff3ef6a751d0-kube-api-access-f7qss\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k"
Apr 23 16:34:09.035769 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035185 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-run-openvswitch\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:09.035769 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035214 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f24243b7-5732-41e7-a97d-ff3ef6a751d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k"
Apr 23 16:34:09.035769 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035240 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-node-log\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:09.035769 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035268 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-log-socket\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:09.035769 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035285 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b2c8879-054c-4712-b5f0-7d3038cf3e84-env-overrides\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:09.035769 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035295 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-var-lib-kubelet\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9"
Apr 23 16:34:09.035769 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ac2dc8cd-1e1d-484d-a299-65eb94658e63-sys-fs\") pod \"aws-ebs-csi-driver-node-9csvm\" (UID: \"ac2dc8cd-1e1d-484d-a299-65eb94658e63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm"
Apr 23 16:34:09.035769 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035354 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-etc-modprobe-d\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9"
Apr 23 16:34:09.035769 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035380 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/821ae57d-81ac-4242-a0c9-51cdf1716096-etc-tuned\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9"
Apr 23 16:34:09.035769 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035409 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-multus-socket-dir-parent\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.035769 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035431 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b2c8879-054c-4712-b5f0-7d3038cf3e84-ovnkube-config\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:09.035769 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035438 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-run-systemd\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:09.035769 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035467 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-etc-sysctl-d\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9"
Apr 23 16:34:09.035769 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035491 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-host\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9"
Apr 23 16:34:09.035769 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035518 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-host-run-netns\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.036679 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035584 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f24243b7-5732-41e7-a97d-ff3ef6a751d0-system-cni-dir\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k"
Apr 23 16:34:09.036679 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035584 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f24243b7-5732-41e7-a97d-ff3ef6a751d0-os-release\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k"
Apr 23 16:34:09.036679 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035609 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-host-var-lib-cni-multus\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.036679 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035636 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f24243b7-5732-41e7-a97d-ff3ef6a751d0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k"
Apr 23 16:34:09.036679 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035643 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-host-kubelet\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:09.036679 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035667 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-run-systemd\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:09.036679 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035678 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa-serviceca\") pod \"node-ca-dp4mb\" (UID: \"10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa\") " pod="openshift-image-registry/node-ca-dp4mb"
Apr 23 16:34:09.036679 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035701 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-log-socket\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:09.036679 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035709 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\"
(UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-host-slash\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.036679 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035709 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-host\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:09.036679 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035775 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-etc-systemd\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:09.036679 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035808 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-host-var-lib-kubelet\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw" Apr 23 16:34:09.036679 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035839 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-host-run-netns\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.036679 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035843 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-host-slash\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.036679 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035868 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-var-lib-openvswitch\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.036679 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035812 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-etc-sysctl-d\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:09.036679 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035890 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ac2dc8cd-1e1d-484d-a299-65eb94658e63-sys-fs\") pod \"aws-ebs-csi-driver-node-9csvm\" (UID: \"ac2dc8cd-1e1d-484d-a299-65eb94658e63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm" Apr 23 16:34:09.037503 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035949 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-sys\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:09.037503 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035957 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/ac2dc8cd-1e1d-484d-a299-65eb94658e63-socket-dir\") pod \"aws-ebs-csi-driver-node-9csvm\" (UID: \"ac2dc8cd-1e1d-484d-a299-65eb94658e63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm" Apr 23 16:34:09.037503 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035964 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-var-lib-kubelet\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:09.037503 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.035981 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-etc-kubernetes\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:09.037503 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036003 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8qbm\" (UniqueName: \"kubernetes.io/projected/0cb82eee-8d58-46f1-8148-5a83f7d6a3a1-kube-api-access-l8qbm\") pod \"iptables-alerter-cx8tc\" (UID: \"0cb82eee-8d58-46f1-8148-5a83f7d6a3a1\") " pod="openshift-network-operator/iptables-alerter-cx8tc" Apr 23 16:34:09.037503 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036003 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-etc-systemd\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:09.037503 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036008 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-etc-modprobe-d\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:09.037503 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036003 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-run-openvswitch\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.037503 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036030 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-host-run-netns\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.037503 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036045 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbfqf\" (UniqueName: \"kubernetes.io/projected/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-kube-api-access-lbfqf\") pod \"network-metrics-daemon-glcj7\" (UID: \"eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef\") " pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:09.037503 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036069 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/613078ff-a7b4-43be-9362-ff4e9be86af1-agent-certs\") pod \"konnectivity-agent-jn7xj\" (UID: \"613078ff-a7b4-43be-9362-ff4e9be86af1\") " pod="kube-system/konnectivity-agent-jn7xj" Apr 23 16:34:09.037503 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036071 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-var-lib-openvswitch\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.037503 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036102 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ac2dc8cd-1e1d-484d-a299-65eb94658e63-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9csvm\" (UID: \"ac2dc8cd-1e1d-484d-a299-65eb94658e63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm" Apr 23 16:34:09.037503 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036117 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ac2dc8cd-1e1d-484d-a299-65eb94658e63-socket-dir\") pod \"aws-ebs-csi-driver-node-9csvm\" (UID: \"ac2dc8cd-1e1d-484d-a299-65eb94658e63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm" Apr 23 16:34:09.037503 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036131 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-run\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:09.037503 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036119 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-etc-kubernetes\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:09.037503 ip-10-0-133-231 kubenswrapper[2572]: I0423 
16:34:09.036163 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-node-log\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.038502 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036193 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ac2dc8cd-1e1d-484d-a299-65eb94658e63-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9csvm\" (UID: \"ac2dc8cd-1e1d-484d-a299-65eb94658e63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm" Apr 23 16:34:09.038502 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036165 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-host-run-k8s-cni-cncf-io\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw" Apr 23 16:34:09.038502 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036210 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-run\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:09.038502 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036248 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-etc-sysconfig\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:09.038502 ip-10-0-133-231 kubenswrapper[2572]: I0423 
16:34:09.036275 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdmpl\" (UniqueName: \"kubernetes.io/projected/821ae57d-81ac-4242-a0c9-51cdf1716096-kube-api-access-jdmpl\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:09.038502 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036301 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-system-cni-dir\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw" Apr 23 16:34:09.038502 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036327 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-os-release\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw" Apr 23 16:34:09.038502 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036373 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-host-var-lib-cni-bin\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw" Apr 23 16:34:09.038502 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036400 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-etc-kubernetes\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw" Apr 23 16:34:09.038502 
ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036426 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa-host\") pod \"node-ca-dp4mb\" (UID: \"10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa\") " pod="openshift-image-registry/node-ca-dp4mb" Apr 23 16:34:09.038502 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036469 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-systemd-units\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.038502 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036488 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-etc-sysconfig\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:09.038502 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036495 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-host-cni-bin\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.038502 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036553 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-etc-sysctl-conf\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:09.038502 
ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036556 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-systemd-units\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.038502 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036555 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-host-cni-bin\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.038502 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036590 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0cb82eee-8d58-46f1-8148-5a83f7d6a3a1-iptables-alerter-script\") pod \"iptables-alerter-cx8tc\" (UID: \"0cb82eee-8d58-46f1-8148-5a83f7d6a3a1\") " pod="openshift-network-operator/iptables-alerter-cx8tc" Apr 23 16:34:09.039289 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036628 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwd7s\" (UniqueName: \"kubernetes.io/projected/bf96e64c-13d1-4533-b0fb-a69566795f63-kube-api-access-lwd7s\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw" Apr 23 16:34:09.039289 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036652 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/821ae57d-81ac-4242-a0c9-51cdf1716096-etc-sysctl-conf\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " 
pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" Apr 23 16:34:09.039289 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036665 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs\") pod \"network-metrics-daemon-glcj7\" (UID: \"eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef\") " pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:09.039289 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036698 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-etc-openvswitch\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.039289 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036727 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-host-cni-netd\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.039289 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036754 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxmlv\" (UniqueName: \"kubernetes.io/projected/4b2c8879-054c-4712-b5f0-7d3038cf3e84-kube-api-access-wxmlv\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.039289 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036768 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-etc-openvswitch\") pod 
\"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.039289 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036780 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24r5n\" (UniqueName: \"kubernetes.io/projected/ac2dc8cd-1e1d-484d-a299-65eb94658e63-kube-api-access-24r5n\") pod \"aws-ebs-csi-driver-node-9csvm\" (UID: \"ac2dc8cd-1e1d-484d-a299-65eb94658e63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm" Apr 23 16:34:09.039289 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036808 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f24243b7-5732-41e7-a97d-ff3ef6a751d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k" Apr 23 16:34:09.039289 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036826 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-host-cni-netd\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.039289 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036837 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-host-run-ovn-kubernetes\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.039289 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036863 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/4b2c8879-054c-4712-b5f0-7d3038cf3e84-ovnkube-script-lib\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.039289 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036874 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f24243b7-5732-41e7-a97d-ff3ef6a751d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k" Apr 23 16:34:09.039289 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036888 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-hostroot\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw" Apr 23 16:34:09.039289 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036935 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sv2v\" (UniqueName: \"kubernetes.io/projected/1fcbd9bc-88ba-48d1-978b-f8e2585ab84c-kube-api-access-2sv2v\") pod \"network-check-target-d7t2c\" (UID: \"1fcbd9bc-88ba-48d1-978b-f8e2585ab84c\") " pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:09.039289 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036959 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-host-run-ovn-kubernetes\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.039289 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.036966 
2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-run-ovn\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.039804 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.037050 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b2c8879-054c-4712-b5f0-7d3038cf3e84-run-ovn\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.039804 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.037582 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4b2c8879-054c-4712-b5f0-7d3038cf3e84-ovnkube-script-lib\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.039804 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.037958 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f24243b7-5732-41e7-a97d-ff3ef6a751d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k" Apr 23 16:34:09.039804 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.038309 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b2c8879-054c-4712-b5f0-7d3038cf3e84-ovn-node-metrics-cert\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:09.039804 ip-10-0-133-231 kubenswrapper[2572]: I0423 
16:34:09.038768 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/821ae57d-81ac-4242-a0c9-51cdf1716096-etc-tuned\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9"
Apr 23 16:34:09.039804 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.038900 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/821ae57d-81ac-4242-a0c9-51cdf1716096-tmp\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9"
Apr 23 16:34:09.039804 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.039383 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/613078ff-a7b4-43be-9362-ff4e9be86af1-agent-certs\") pod \"konnectivity-agent-jn7xj\" (UID: \"613078ff-a7b4-43be-9362-ff4e9be86af1\") " pod="kube-system/konnectivity-agent-jn7xj"
Apr 23 16:34:09.047088 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.047055 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7qss\" (UniqueName: \"kubernetes.io/projected/f24243b7-5732-41e7-a97d-ff3ef6a751d0-kube-api-access-f7qss\") pod \"multus-additional-cni-plugins-r652k\" (UID: \"f24243b7-5732-41e7-a97d-ff3ef6a751d0\") " pod="openshift-multus/multus-additional-cni-plugins-r652k"
Apr 23 16:34:09.047184 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.047141 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdmpl\" (UniqueName: \"kubernetes.io/projected/821ae57d-81ac-4242-a0c9-51cdf1716096-kube-api-access-jdmpl\") pod \"tuned-sjmb9\" (UID: \"821ae57d-81ac-4242-a0c9-51cdf1716096\") " pod="openshift-cluster-node-tuning-operator/tuned-sjmb9"
Apr 23 16:34:09.047455 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.047438 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvvtz\" (UniqueName: \"kubernetes.io/projected/7789b2de-75cc-4057-8981-8850b48ac765-kube-api-access-mvvtz\") pod \"node-resolver-mqskj\" (UID: \"7789b2de-75cc-4057-8981-8850b48ac765\") " pod="openshift-dns/node-resolver-mqskj"
Apr 23 16:34:09.048336 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.048316 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxmlv\" (UniqueName: \"kubernetes.io/projected/4b2c8879-054c-4712-b5f0-7d3038cf3e84-kube-api-access-wxmlv\") pod \"ovnkube-node-95t7c\" (UID: \"4b2c8879-054c-4712-b5f0-7d3038cf3e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:09.049174 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.049143 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24r5n\" (UniqueName: \"kubernetes.io/projected/ac2dc8cd-1e1d-484d-a299-65eb94658e63-kube-api-access-24r5n\") pod \"aws-ebs-csi-driver-node-9csvm\" (UID: \"ac2dc8cd-1e1d-484d-a299-65eb94658e63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm"
Apr 23 16:34:09.138136 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138106 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-cnibin\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138136 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138145 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-multus-conf-dir\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138362 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138171 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bf96e64c-13d1-4533-b0fb-a69566795f63-multus-daemon-config\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138362 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138194 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-multus-cni-dir\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138362 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138216 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-host-run-multus-certs\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138362 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138234 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-cnibin\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138362 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138256 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-multus-socket-dir-parent\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138362 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138235 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-multus-conf-dir\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138362 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138282 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-host-run-netns\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138362 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138295 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-host-run-multus-certs\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138362 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138296 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-multus-cni-dir\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138362 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138311 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-host-var-lib-cni-multus\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138362 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138345 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-host-run-netns\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138362 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138351 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa-serviceca\") pod \"node-ca-dp4mb\" (UID: \"10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa\") " pod="openshift-image-registry/node-ca-dp4mb"
Apr 23 16:34:09.138362 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138353 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-host-var-lib-cni-multus\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138943 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138346 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-multus-socket-dir-parent\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138943 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138400 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-host-var-lib-kubelet\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138943 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138432 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8qbm\" (UniqueName: \"kubernetes.io/projected/0cb82eee-8d58-46f1-8148-5a83f7d6a3a1-kube-api-access-l8qbm\") pod \"iptables-alerter-cx8tc\" (UID: \"0cb82eee-8d58-46f1-8148-5a83f7d6a3a1\") " pod="openshift-network-operator/iptables-alerter-cx8tc"
Apr 23 16:34:09.138943 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138456 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-host-var-lib-kubelet\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138943 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138459 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbfqf\" (UniqueName: \"kubernetes.io/projected/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-kube-api-access-lbfqf\") pod \"network-metrics-daemon-glcj7\" (UID: \"eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef\") " pod="openshift-multus/network-metrics-daemon-glcj7"
Apr 23 16:34:09.138943 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138508 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-host-run-k8s-cni-cncf-io\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138943 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138537 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-system-cni-dir\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138943 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138570 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-host-run-k8s-cni-cncf-io\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138943 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138566 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-os-release\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138943 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138615 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-host-var-lib-cni-bin\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138943 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138627 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-system-cni-dir\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138943 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138640 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-etc-kubernetes\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138943 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138655 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-host-var-lib-cni-bin\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138943 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138650 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-os-release\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138943 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138687 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-etc-kubernetes\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.138943 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138711 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa-host\") pod \"node-ca-dp4mb\" (UID: \"10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa\") " pod="openshift-image-registry/node-ca-dp4mb"
Apr 23 16:34:09.138943 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138733 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0cb82eee-8d58-46f1-8148-5a83f7d6a3a1-iptables-alerter-script\") pod \"iptables-alerter-cx8tc\" (UID: \"0cb82eee-8d58-46f1-8148-5a83f7d6a3a1\") " pod="openshift-network-operator/iptables-alerter-cx8tc"
Apr 23 16:34:09.138943 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138759 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwd7s\" (UniqueName: \"kubernetes.io/projected/bf96e64c-13d1-4533-b0fb-a69566795f63-kube-api-access-lwd7s\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.139653 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138774 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs\") pod \"network-metrics-daemon-glcj7\" (UID: \"eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef\") " pod="openshift-multus/network-metrics-daemon-glcj7"
Apr 23 16:34:09.139653 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138771 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa-host\") pod \"node-ca-dp4mb\" (UID: \"10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa\") " pod="openshift-image-registry/node-ca-dp4mb"
Apr 23 16:34:09.139653 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138794 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-hostroot\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.139653 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138798 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa-serviceca\") pod \"node-ca-dp4mb\" (UID: \"10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa\") " pod="openshift-image-registry/node-ca-dp4mb"
Apr 23 16:34:09.139653 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138819 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sv2v\" (UniqueName: \"kubernetes.io/projected/1fcbd9bc-88ba-48d1-978b-f8e2585ab84c-kube-api-access-2sv2v\") pod \"network-check-target-d7t2c\" (UID: \"1fcbd9bc-88ba-48d1-978b-f8e2585ab84c\") " pod="openshift-network-diagnostics/network-check-target-d7t2c"
Apr 23 16:34:09.139653 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138840 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bf96e64c-13d1-4533-b0fb-a69566795f63-hostroot\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.139653 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138845 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q28c8\" (UniqueName: \"kubernetes.io/projected/10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa-kube-api-access-q28c8\") pod \"node-ca-dp4mb\" (UID: \"10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa\") " pod="openshift-image-registry/node-ca-dp4mb"
Apr 23 16:34:09.139653 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:09.138891 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:34:09.139653 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138856 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bf96e64c-13d1-4533-b0fb-a69566795f63-multus-daemon-config\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.139653 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:09.138979 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs podName:eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef nodeName:}" failed. No retries permitted until 2026-04-23 16:34:09.638958344 +0000 UTC m=+3.211471275 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs") pod "network-metrics-daemon-glcj7" (UID: "eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:34:09.139653 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.138893 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf96e64c-13d1-4533-b0fb-a69566795f63-cni-binary-copy\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.139653 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.139040 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0cb82eee-8d58-46f1-8148-5a83f7d6a3a1-host-slash\") pod \"iptables-alerter-cx8tc\" (UID: \"0cb82eee-8d58-46f1-8148-5a83f7d6a3a1\") " pod="openshift-network-operator/iptables-alerter-cx8tc"
Apr 23 16:34:09.139653 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.139139 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0cb82eee-8d58-46f1-8148-5a83f7d6a3a1-host-slash\") pod \"iptables-alerter-cx8tc\" (UID: \"0cb82eee-8d58-46f1-8148-5a83f7d6a3a1\") " pod="openshift-network-operator/iptables-alerter-cx8tc"
Apr 23 16:34:09.139653 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.139340 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0cb82eee-8d58-46f1-8148-5a83f7d6a3a1-iptables-alerter-script\") pod \"iptables-alerter-cx8tc\" (UID: \"0cb82eee-8d58-46f1-8148-5a83f7d6a3a1\") " pod="openshift-network-operator/iptables-alerter-cx8tc"
Apr 23 16:34:09.139653 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.139445 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf96e64c-13d1-4533-b0fb-a69566795f63-cni-binary-copy\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.154229 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:09.154105 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:34:09.154229 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:09.154133 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:34:09.154229 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:09.154144 2572 projected.go:194] Error preparing data for projected volume kube-api-access-2sv2v for pod openshift-network-diagnostics/network-check-target-d7t2c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:34:09.154229 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:09.154212 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fcbd9bc-88ba-48d1-978b-f8e2585ab84c-kube-api-access-2sv2v podName:1fcbd9bc-88ba-48d1-978b-f8e2585ab84c nodeName:}" failed. No retries permitted until 2026-04-23 16:34:09.654195513 +0000 UTC m=+3.226708445 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-2sv2v" (UniqueName: "kubernetes.io/projected/1fcbd9bc-88ba-48d1-978b-f8e2585ab84c-kube-api-access-2sv2v") pod "network-check-target-d7t2c" (UID: "1fcbd9bc-88ba-48d1-978b-f8e2585ab84c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:34:09.156964 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.156946 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q28c8\" (UniqueName: \"kubernetes.io/projected/10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa-kube-api-access-q28c8\") pod \"node-ca-dp4mb\" (UID: \"10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa\") " pod="openshift-image-registry/node-ca-dp4mb"
Apr 23 16:34:09.157146 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.157123 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbfqf\" (UniqueName: \"kubernetes.io/projected/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-kube-api-access-lbfqf\") pod \"network-metrics-daemon-glcj7\" (UID: \"eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef\") " pod="openshift-multus/network-metrics-daemon-glcj7"
Apr 23 16:34:09.158548 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.158525 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwd7s\" (UniqueName: \"kubernetes.io/projected/bf96e64c-13d1-4533-b0fb-a69566795f63-kube-api-access-lwd7s\") pod \"multus-wjwgw\" (UID: \"bf96e64c-13d1-4533-b0fb-a69566795f63\") " pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.158639 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.158590 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:34:09.158699 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.158674 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8qbm\" (UniqueName: \"kubernetes.io/projected/0cb82eee-8d58-46f1-8148-5a83f7d6a3a1-kube-api-access-l8qbm\") pod \"iptables-alerter-cx8tc\" (UID: \"0cb82eee-8d58-46f1-8148-5a83f7d6a3a1\") " pod="openshift-network-operator/iptables-alerter-cx8tc"
Apr 23 16:34:09.225866 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.225836 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-r652k"
Apr 23 16:34:09.235532 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.235510 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-95t7c"
Apr 23 16:34:09.244170 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.244151 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jn7xj"
Apr 23 16:34:09.249757 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.249739 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm"
Apr 23 16:34:09.257360 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.257343 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-sjmb9"
Apr 23 16:34:09.266869 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.266844 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mqskj"
Apr 23 16:34:09.276331 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.276309 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-dp4mb"
Apr 23 16:34:09.282804 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.282788 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-cx8tc"
Apr 23 16:34:09.288339 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.288309 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wjwgw"
Apr 23 16:34:09.643062 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.643030 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs\") pod \"network-metrics-daemon-glcj7\" (UID: \"eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef\") " pod="openshift-multus/network-metrics-daemon-glcj7"
Apr 23 16:34:09.643233 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:09.643158 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:34:09.643233 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:09.643221 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs podName:eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef nodeName:}" failed. No retries permitted until 2026-04-23 16:34:10.64320479 +0000 UTC m=+4.215717705 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs") pod "network-metrics-daemon-glcj7" (UID: "eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:34:09.722697 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:09.722605 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7789b2de_75cc_4057_8981_8850b48ac765.slice/crio-d0768cc98c31111e10df8c8b8f855d4938ecee3dd2ea88b7c0597cdff283cba0 WatchSource:0}: Error finding container d0768cc98c31111e10df8c8b8f855d4938ecee3dd2ea88b7c0597cdff283cba0: Status 404 returned error can't find the container with id d0768cc98c31111e10df8c8b8f855d4938ecee3dd2ea88b7c0597cdff283cba0
Apr 23 16:34:09.723862 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:09.723806 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf24243b7_5732_41e7_a97d_ff3ef6a751d0.slice/crio-e74f6d92a5d609b9b073fda8838a16e9bdfff2e2ab3b3a5e35cba47d9814872a WatchSource:0}: Error finding container e74f6d92a5d609b9b073fda8838a16e9bdfff2e2ab3b3a5e35cba47d9814872a: Status 404 returned error can't find the container with id e74f6d92a5d609b9b073fda8838a16e9bdfff2e2ab3b3a5e35cba47d9814872a
Apr 23 16:34:09.743958 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.743909 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sv2v\" (UniqueName: \"kubernetes.io/projected/1fcbd9bc-88ba-48d1-978b-f8e2585ab84c-kube-api-access-2sv2v\") pod \"network-check-target-d7t2c\" (UID: \"1fcbd9bc-88ba-48d1-978b-f8e2585ab84c\") " pod="openshift-network-diagnostics/network-check-target-d7t2c"
Apr 23 16:34:09.744042 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:09.744020 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:34:09.744042 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:09.744035 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:34:09.744042 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:09.744043 2572 projected.go:194] Error preparing data for projected volume kube-api-access-2sv2v for pod openshift-network-diagnostics/network-check-target-d7t2c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:34:09.744140 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:09.744088 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fcbd9bc-88ba-48d1-978b-f8e2585ab84c-kube-api-access-2sv2v podName:1fcbd9bc-88ba-48d1-978b-f8e2585ab84c nodeName:}" failed. No retries permitted until 2026-04-23 16:34:10.744072164 +0000 UTC m=+4.316585063 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-2sv2v" (UniqueName: "kubernetes.io/projected/1fcbd9bc-88ba-48d1-978b-f8e2585ab84c-kube-api-access-2sv2v") pod "network-check-target-d7t2c" (UID: "1fcbd9bc-88ba-48d1-978b-f8e2585ab84c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:34:09.747705 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:09.747646 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac2dc8cd_1e1d_484d_a299_65eb94658e63.slice/crio-c25477fecddcbd880b1e1df708992848d9fa2b411b333534cc5d46c132289657 WatchSource:0}: Error finding container c25477fecddcbd880b1e1df708992848d9fa2b411b333534cc5d46c132289657: Status 404 returned error can't find the container with id c25477fecddcbd880b1e1df708992848d9fa2b411b333534cc5d46c132289657
Apr 23 16:34:09.749293 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:09.749273 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b2c8879_054c_4712_b5f0_7d3038cf3e84.slice/crio-607cd053d0c7a41c95e15c143dea9de01a626301c330eaa3fe7d35b0aaa96e19 WatchSource:0}: Error finding container 607cd053d0c7a41c95e15c143dea9de01a626301c330eaa3fe7d35b0aaa96e19: Status 404 returned error can't find the container with id 607cd053d0c7a41c95e15c143dea9de01a626301c330eaa3fe7d35b0aaa96e19
Apr 23 16:34:09.750857 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:09.750825 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf96e64c_13d1_4533_b0fb_a69566795f63.slice/crio-78cd36cf4eb893e464c377dc734d129c86a368904f5df24d64abd78b4dce2828 WatchSource:0}: Error finding container 78cd36cf4eb893e464c377dc734d129c86a368904f5df24d64abd78b4dce2828: Status 404 returned error can't find the container with id 78cd36cf4eb893e464c377dc734d129c86a368904f5df24d64abd78b4dce2828
Apr 23 16:34:09.752268 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:09.752150 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cb82eee_8d58_46f1_8148_5a83f7d6a3a1.slice/crio-6e29be3b97b5dfe4a87fe8e8d52be7b4d626dff1bbd628f41094079414343fb7 WatchSource:0}: Error finding container 6e29be3b97b5dfe4a87fe8e8d52be7b4d626dff1bbd628f41094079414343fb7: Status 404 returned error can't find the container with id 6e29be3b97b5dfe4a87fe8e8d52be7b4d626dff1bbd628f41094079414343fb7
Apr 23 16:34:09.754220 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:09.754193 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod613078ff_a7b4_43be_9362_ff4e9be86af1.slice/crio-4e44553aaa5c6a42f39cc5468b5b2e61bf5ddfeb335e8438ccaacf2ebe7f7143 WatchSource:0}: Error finding container 4e44553aaa5c6a42f39cc5468b5b2e61bf5ddfeb335e8438ccaacf2ebe7f7143: Status 404 returned error can't find the container with id 4e44553aaa5c6a42f39cc5468b5b2e61bf5ddfeb335e8438ccaacf2ebe7f7143
Apr 23 16:34:09.756143 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:09.756053 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10bc9a50_0524_4e20_a0cb_cfbcb6a0f1fa.slice/crio-408cfe93684ce542d34b13faa7559a789b7d297581c807d555c2bbc0dd210181 WatchSource:0}: Error finding container 408cfe93684ce542d34b13faa7559a789b7d297581c807d555c2bbc0dd210181: Status 404 returned error can't find the container with id 408cfe93684ce542d34b13faa7559a789b7d297581c807d555c2bbc0dd210181
Apr 23 16:34:09.973973 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.973939 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:29:07 +0000 UTC" deadline="2027-12-18 08:32:51.762857519 +0000 UTC"
Apr 23 16:34:09.973973 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:09.973970 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14487h58m41.788891472s"
Apr 23 16:34:10.026338 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:10.026309 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7t2c"
Apr 23 16:34:10.026467 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:10.026427 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7t2c" podUID="1fcbd9bc-88ba-48d1-978b-f8e2585ab84c"
Apr 23 16:34:10.034219 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:10.034192 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" event={"ID":"821ae57d-81ac-4242-a0c9-51cdf1716096","Type":"ContainerStarted","Data":"c38e4b4b42600f281fa936f902cd503be3b52d478b19fdcea9bba8e48e7ffe3d"}
Apr 23 16:34:10.035244 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:10.035220 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cx8tc" event={"ID":"0cb82eee-8d58-46f1-8148-5a83f7d6a3a1","Type":"ContainerStarted","Data":"6e29be3b97b5dfe4a87fe8e8d52be7b4d626dff1bbd628f41094079414343fb7"}
Apr 23 16:34:10.036171 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:10.036141 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm" event={"ID":"ac2dc8cd-1e1d-484d-a299-65eb94658e63","Type":"ContainerStarted","Data":"c25477fecddcbd880b1e1df708992848d9fa2b411b333534cc5d46c132289657"}
Apr 23 16:34:10.037165 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:10.037135 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r652k" event={"ID":"f24243b7-5732-41e7-a97d-ff3ef6a751d0","Type":"ContainerStarted","Data":"e74f6d92a5d609b9b073fda8838a16e9bdfff2e2ab3b3a5e35cba47d9814872a"}
Apr 23 16:34:10.038089 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:10.038064 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mqskj" event={"ID":"7789b2de-75cc-4057-8981-8850b48ac765","Type":"ContainerStarted","Data":"d0768cc98c31111e10df8c8b8f855d4938ecee3dd2ea88b7c0597cdff283cba0"}
Apr 23 16:34:10.039483 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:10.039464 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal" event={"ID":"1eb73e8eaa1b833503296b19a264c17c","Type":"ContainerStarted","Data":"6425ac14c57ca4f9435d00fab4c56ea42e2ef855b6ad2865d37a9b4203ce4857"}
Apr 23 16:34:10.040454 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:10.040434 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dp4mb" event={"ID":"10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa","Type":"ContainerStarted","Data":"408cfe93684ce542d34b13faa7559a789b7d297581c807d555c2bbc0dd210181"}
Apr 23 16:34:10.041438 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:10.041412 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jn7xj" event={"ID":"613078ff-a7b4-43be-9362-ff4e9be86af1","Type":"ContainerStarted","Data":"4e44553aaa5c6a42f39cc5468b5b2e61bf5ddfeb335e8438ccaacf2ebe7f7143"}
Apr 23 16:34:10.042384 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:10.042361 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wjwgw" event={"ID":"bf96e64c-13d1-4533-b0fb-a69566795f63","Type":"ContainerStarted","Data":"78cd36cf4eb893e464c377dc734d129c86a368904f5df24d64abd78b4dce2828"}
Apr 23 16:34:10.043349 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:10.043318 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" event={"ID":"4b2c8879-054c-4712-b5f0-7d3038cf3e84","Type":"ContainerStarted","Data":"607cd053d0c7a41c95e15c143dea9de01a626301c330eaa3fe7d35b0aaa96e19"}
Apr 23 16:34:10.054420 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:10.054386 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-231.ec2.internal" podStartSLOduration=2.054374871 podStartE2EDuration="2.054374871s" podCreationTimestamp="2026-04-23 16:34:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:34:10.054324773 +0000 UTC m=+3.626837696" watchObservedRunningTime="2026-04-23 16:34:10.054374871 +0000 UTC m=+3.626887791"
Apr 23 16:34:10.657130 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:10.656446 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs\") pod \"network-metrics-daemon-glcj7\" (UID: \"eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef\") " pod="openshift-multus/network-metrics-daemon-glcj7"
Apr 23 16:34:10.657130 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:10.656656 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:34:10.657130 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:10.656721 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs
podName:eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef nodeName:}" failed. No retries permitted until 2026-04-23 16:34:12.656701705 +0000 UTC m=+6.229214617 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs") pod "network-metrics-daemon-glcj7" (UID: "eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:10.757532 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:10.757491 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sv2v\" (UniqueName: \"kubernetes.io/projected/1fcbd9bc-88ba-48d1-978b-f8e2585ab84c-kube-api-access-2sv2v\") pod \"network-check-target-d7t2c\" (UID: \"1fcbd9bc-88ba-48d1-978b-f8e2585ab84c\") " pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:10.757726 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:10.757702 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:34:10.757726 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:10.757724 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:34:10.757836 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:10.757737 2572 projected.go:194] Error preparing data for projected volume kube-api-access-2sv2v for pod openshift-network-diagnostics/network-check-target-d7t2c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:10.757836 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:10.757798 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/1fcbd9bc-88ba-48d1-978b-f8e2585ab84c-kube-api-access-2sv2v podName:1fcbd9bc-88ba-48d1-978b-f8e2585ab84c nodeName:}" failed. No retries permitted until 2026-04-23 16:34:12.757779732 +0000 UTC m=+6.330292638 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-2sv2v" (UniqueName: "kubernetes.io/projected/1fcbd9bc-88ba-48d1-978b-f8e2585ab84c-kube-api-access-2sv2v") pod "network-check-target-d7t2c" (UID: "1fcbd9bc-88ba-48d1-978b-f8e2585ab84c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:11.026993 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:11.026873 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:11.027386 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:11.027031 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-glcj7" podUID="eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef" Apr 23 16:34:11.060480 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:11.060447 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal" event={"ID":"23282ac0c4062a2415d09037c9c74b84","Type":"ContainerStarted","Data":"78fad2d5dad97ff11deba5a0394a11412acf097bf90efc1c22c24fd2918d361a"} Apr 23 16:34:12.026825 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:12.026789 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:12.027019 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:12.026953 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7t2c" podUID="1fcbd9bc-88ba-48d1-978b-f8e2585ab84c" Apr 23 16:34:12.074252 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:12.074217 2572 generic.go:358] "Generic (PLEG): container finished" podID="23282ac0c4062a2415d09037c9c74b84" containerID="78fad2d5dad97ff11deba5a0394a11412acf097bf90efc1c22c24fd2918d361a" exitCode=0 Apr 23 16:34:12.074425 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:12.074273 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal" event={"ID":"23282ac0c4062a2415d09037c9c74b84","Type":"ContainerDied","Data":"78fad2d5dad97ff11deba5a0394a11412acf097bf90efc1c22c24fd2918d361a"} Apr 23 16:34:12.671441 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:12.671267 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs\") pod \"network-metrics-daemon-glcj7\" (UID: \"eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef\") " pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:12.671441 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:12.671408 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:12.671675 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:12.671479 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs podName:eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef nodeName:}" failed. No retries permitted until 2026-04-23 16:34:16.671459677 +0000 UTC m=+10.243972581 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs") pod "network-metrics-daemon-glcj7" (UID: "eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:12.772667 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:12.772580 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sv2v\" (UniqueName: \"kubernetes.io/projected/1fcbd9bc-88ba-48d1-978b-f8e2585ab84c-kube-api-access-2sv2v\") pod \"network-check-target-d7t2c\" (UID: \"1fcbd9bc-88ba-48d1-978b-f8e2585ab84c\") " pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:12.772830 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:12.772771 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:34:12.772830 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:12.772796 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:34:12.772830 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:12.772811 2572 projected.go:194] Error preparing data for projected volume kube-api-access-2sv2v for pod openshift-network-diagnostics/network-check-target-d7t2c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:12.773014 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:12.772877 2572 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fcbd9bc-88ba-48d1-978b-f8e2585ab84c-kube-api-access-2sv2v podName:1fcbd9bc-88ba-48d1-978b-f8e2585ab84c nodeName:}" failed. No retries permitted until 2026-04-23 16:34:16.772857045 +0000 UTC m=+10.345369963 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-2sv2v" (UniqueName: "kubernetes.io/projected/1fcbd9bc-88ba-48d1-978b-f8e2585ab84c-kube-api-access-2sv2v") pod "network-check-target-d7t2c" (UID: "1fcbd9bc-88ba-48d1-978b-f8e2585ab84c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:13.026431 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:13.026354 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:13.026594 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:13.026501 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-glcj7" podUID="eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef" Apr 23 16:34:14.026275 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:14.026240 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:14.026736 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:14.026374 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7t2c" podUID="1fcbd9bc-88ba-48d1-978b-f8e2585ab84c" Apr 23 16:34:15.026637 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:15.026604 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:15.027101 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:15.026746 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-glcj7" podUID="eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef" Apr 23 16:34:16.026428 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:16.026399 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:16.026600 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:16.026518 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-d7t2c" podUID="1fcbd9bc-88ba-48d1-978b-f8e2585ab84c" Apr 23 16:34:16.703247 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:16.703206 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs\") pod \"network-metrics-daemon-glcj7\" (UID: \"eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef\") " pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:16.703713 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:16.703384 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:16.703713 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:16.703452 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs podName:eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef nodeName:}" failed. No retries permitted until 2026-04-23 16:34:24.703432382 +0000 UTC m=+18.275945286 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs") pod "network-metrics-daemon-glcj7" (UID: "eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:16.804022 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:16.803982 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sv2v\" (UniqueName: \"kubernetes.io/projected/1fcbd9bc-88ba-48d1-978b-f8e2585ab84c-kube-api-access-2sv2v\") pod \"network-check-target-d7t2c\" (UID: \"1fcbd9bc-88ba-48d1-978b-f8e2585ab84c\") " pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:16.804187 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:16.804141 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:34:16.804187 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:16.804163 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:34:16.804187 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:16.804185 2572 projected.go:194] Error preparing data for projected volume kube-api-access-2sv2v for pod openshift-network-diagnostics/network-check-target-d7t2c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:16.804346 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:16.804243 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fcbd9bc-88ba-48d1-978b-f8e2585ab84c-kube-api-access-2sv2v podName:1fcbd9bc-88ba-48d1-978b-f8e2585ab84c nodeName:}" failed. 
No retries permitted until 2026-04-23 16:34:24.804225039 +0000 UTC m=+18.376737963 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-2sv2v" (UniqueName: "kubernetes.io/projected/1fcbd9bc-88ba-48d1-978b-f8e2585ab84c-kube-api-access-2sv2v") pod "network-check-target-d7t2c" (UID: "1fcbd9bc-88ba-48d1-978b-f8e2585ab84c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:17.030793 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:17.030291 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:17.030793 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:17.030417 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-glcj7" podUID="eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef" Apr 23 16:34:18.027013 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:18.026676 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:18.027013 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:18.026811 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-d7t2c" podUID="1fcbd9bc-88ba-48d1-978b-f8e2585ab84c" Apr 23 16:34:19.030080 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:19.030023 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:19.030521 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:19.030169 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-glcj7" podUID="eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef" Apr 23 16:34:20.026061 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:20.026034 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:20.026232 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:20.026143 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7t2c" podUID="1fcbd9bc-88ba-48d1-978b-f8e2585ab84c" Apr 23 16:34:21.025925 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:21.025892 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:21.026384 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:21.026059 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-glcj7" podUID="eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef" Apr 23 16:34:22.026444 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:22.026411 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:22.026884 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:22.026515 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7t2c" podUID="1fcbd9bc-88ba-48d1-978b-f8e2585ab84c" Apr 23 16:34:23.029545 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:23.029516 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:23.029975 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:23.029660 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-glcj7" podUID="eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef" Apr 23 16:34:24.025791 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:24.025755 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:24.026012 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:24.025896 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7t2c" podUID="1fcbd9bc-88ba-48d1-978b-f8e2585ab84c" Apr 23 16:34:24.760960 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:24.760898 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs\") pod \"network-metrics-daemon-glcj7\" (UID: \"eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef\") " pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:24.761398 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:24.761079 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:24.761398 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:24.761160 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs podName:eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef nodeName:}" failed. No retries permitted until 2026-04-23 16:34:40.761140615 +0000 UTC m=+34.333653527 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs") pod "network-metrics-daemon-glcj7" (UID: "eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:24.862284 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:24.862251 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sv2v\" (UniqueName: \"kubernetes.io/projected/1fcbd9bc-88ba-48d1-978b-f8e2585ab84c-kube-api-access-2sv2v\") pod \"network-check-target-d7t2c\" (UID: \"1fcbd9bc-88ba-48d1-978b-f8e2585ab84c\") " pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:24.862464 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:24.862380 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:34:24.862464 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:24.862396 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:34:24.862464 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:24.862406 2572 projected.go:194] Error preparing data for projected volume kube-api-access-2sv2v for pod openshift-network-diagnostics/network-check-target-d7t2c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:24.862464 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:24.862463 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fcbd9bc-88ba-48d1-978b-f8e2585ab84c-kube-api-access-2sv2v podName:1fcbd9bc-88ba-48d1-978b-f8e2585ab84c nodeName:}" failed. 
No retries permitted until 2026-04-23 16:34:40.862444372 +0000 UTC m=+34.434957283 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-2sv2v" (UniqueName: "kubernetes.io/projected/1fcbd9bc-88ba-48d1-978b-f8e2585ab84c-kube-api-access-2sv2v") pod "network-check-target-d7t2c" (UID: "1fcbd9bc-88ba-48d1-978b-f8e2585ab84c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:25.029552 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:25.029473 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:25.029697 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:25.029583 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-glcj7" podUID="eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef" Apr 23 16:34:26.026519 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:26.026483 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:26.026900 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:26.026602 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-d7t2c" podUID="1fcbd9bc-88ba-48d1-978b-f8e2585ab84c" Apr 23 16:34:27.027339 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:27.027134 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:27.027764 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:27.027449 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-glcj7" podUID="eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef" Apr 23 16:34:27.102312 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:27.102271 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal" event={"ID":"23282ac0c4062a2415d09037c9c74b84","Type":"ContainerStarted","Data":"8985066bce220c09654f7c913677b7834b05b656406bfcc771dabbf0a84635bd"} Apr 23 16:34:27.103557 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:27.103530 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jn7xj" event={"ID":"613078ff-a7b4-43be-9362-ff4e9be86af1","Type":"ContainerStarted","Data":"d7fc8d32d1b9c9643f725a718efdd9db755d1e6654b95d707637396278c6472f"} Apr 23 16:34:27.130942 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:27.130589 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-231.ec2.internal" podStartSLOduration=19.13055059 podStartE2EDuration="19.13055059s" podCreationTimestamp="2026-04-23 16:34:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-23 16:34:27.13009611 +0000 UTC m=+20.702609032" watchObservedRunningTime="2026-04-23 16:34:27.13055059 +0000 UTC m=+20.703063512" Apr 23 16:34:28.026955 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:28.026721 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:28.027145 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:28.027045 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7t2c" podUID="1fcbd9bc-88ba-48d1-978b-f8e2585ab84c" Apr 23 16:34:28.107564 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:28.107526 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" event={"ID":"821ae57d-81ac-4242-a0c9-51cdf1716096","Type":"ContainerStarted","Data":"ea6fc2171fb956f72fc68598fa19a2ab0b2d39ead8cbbe42a80d8e5e076b3d46"} Apr 23 16:34:28.109226 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:28.109190 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm" event={"ID":"ac2dc8cd-1e1d-484d-a299-65eb94658e63","Type":"ContainerStarted","Data":"495778c2eb582bace8e334c4d7fbb7141105ba7f4e6ad3edf7d9e78c369828c6"} Apr 23 16:34:28.110738 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:28.110710 2572 generic.go:358] "Generic (PLEG): container finished" podID="f24243b7-5732-41e7-a97d-ff3ef6a751d0" containerID="9c69eb4d8d0e2ac5c3e692fdeed42740f39bd7e029c1402b1006b899099d9198" exitCode=0 Apr 23 16:34:28.110858 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:28.110791 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-r652k" event={"ID":"f24243b7-5732-41e7-a97d-ff3ef6a751d0","Type":"ContainerDied","Data":"9c69eb4d8d0e2ac5c3e692fdeed42740f39bd7e029c1402b1006b899099d9198"} Apr 23 16:34:28.112418 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:28.112313 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mqskj" event={"ID":"7789b2de-75cc-4057-8981-8850b48ac765","Type":"ContainerStarted","Data":"60cb11ecaf5d5079cea1d6c75174ca95eaea92084790cb7fcfdfe8cba51e9dbf"} Apr 23 16:34:28.113870 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:28.113846 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dp4mb" event={"ID":"10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa","Type":"ContainerStarted","Data":"b85ab30486df56b9aa738a786b83d59a5ccfad21803fc5d3abb61e53c7dab5de"} Apr 23 16:34:28.115508 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:28.115485 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wjwgw" event={"ID":"bf96e64c-13d1-4533-b0fb-a69566795f63","Type":"ContainerStarted","Data":"72758634d69c45eba6f4a7c9a0f4727d0d9e58cc6e6ea8abd4c68af1d5da08cc"} Apr 23 16:34:28.118695 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:28.118674 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" event={"ID":"4b2c8879-054c-4712-b5f0-7d3038cf3e84","Type":"ContainerStarted","Data":"4c0936c6fd91a3ced45e1687f067fefafc6d2b429958a93e4e6dd1bc1be53355"} Apr 23 16:34:28.118832 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:28.118700 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" event={"ID":"4b2c8879-054c-4712-b5f0-7d3038cf3e84","Type":"ContainerStarted","Data":"f0047875e7512ee3b43b4978f3908ceb51c8b8351857befd6b178f7d3de29bed"} Apr 23 16:34:28.118832 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:28.118709 2572 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" event={"ID":"4b2c8879-054c-4712-b5f0-7d3038cf3e84","Type":"ContainerStarted","Data":"9410d027d41857ed367927b31dec085af73d999787e37be2cbe8826866ad0817"} Apr 23 16:34:28.118832 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:28.118717 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" event={"ID":"4b2c8879-054c-4712-b5f0-7d3038cf3e84","Type":"ContainerStarted","Data":"3f1e64222794a77d1c0338116ae9e9d3ae8959ca2b5bddf9859bca50dc8a6a3d"} Apr 23 16:34:28.118832 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:28.118725 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" event={"ID":"4b2c8879-054c-4712-b5f0-7d3038cf3e84","Type":"ContainerStarted","Data":"ca17cbc16a633e37f42cebdcbbe1f364895ce97c88d02fa5fbf3484ed5dafa93"} Apr 23 16:34:28.118832 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:28.118732 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" event={"ID":"4b2c8879-054c-4712-b5f0-7d3038cf3e84","Type":"ContainerStarted","Data":"92f94bf667fa89e6986771fe2917d4f41f396fb1a5a8fa19261337c5b190545c"} Apr 23 16:34:28.142474 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:28.142429 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-jn7xj" podStartSLOduration=8.620042276 podStartE2EDuration="21.14241764s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:34:09.75684658 +0000 UTC m=+3.329359489" lastFinishedPulling="2026-04-23 16:34:22.279221954 +0000 UTC m=+15.851734853" observedRunningTime="2026-04-23 16:34:27.148285398 +0000 UTC m=+20.720798320" watchObservedRunningTime="2026-04-23 16:34:28.14241764 +0000 UTC m=+21.714930560" Apr 23 16:34:28.142635 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:28.142610 2572 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-sjmb9" podStartSLOduration=4.002593612 podStartE2EDuration="21.142603478s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:34:09.746799814 +0000 UTC m=+3.319312713" lastFinishedPulling="2026-04-23 16:34:26.886809667 +0000 UTC m=+20.459322579" observedRunningTime="2026-04-23 16:34:28.142041726 +0000 UTC m=+21.714554647" watchObservedRunningTime="2026-04-23 16:34:28.142603478 +0000 UTC m=+21.715116399" Apr 23 16:34:28.158296 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:28.158254 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-mqskj" podStartSLOduration=3.997373046 podStartE2EDuration="21.158243501s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:34:09.724560464 +0000 UTC m=+3.297073363" lastFinishedPulling="2026-04-23 16:34:26.885430904 +0000 UTC m=+20.457943818" observedRunningTime="2026-04-23 16:34:28.157895503 +0000 UTC m=+21.730408424" watchObservedRunningTime="2026-04-23 16:34:28.158243501 +0000 UTC m=+21.730756422" Apr 23 16:34:28.245796 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:28.245757 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-dp4mb" podStartSLOduration=4.118485109 podStartE2EDuration="21.245743647s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:34:09.75795926 +0000 UTC m=+3.330472166" lastFinishedPulling="2026-04-23 16:34:26.885217799 +0000 UTC m=+20.457730704" observedRunningTime="2026-04-23 16:34:28.213573961 +0000 UTC m=+21.786086903" watchObservedRunningTime="2026-04-23 16:34:28.245743647 +0000 UTC m=+21.818256568" Apr 23 16:34:28.245945 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:28.245905 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wjwgw" 
podStartSLOduration=3.795854845 podStartE2EDuration="21.245901318s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:34:09.75700825 +0000 UTC m=+3.329521154" lastFinishedPulling="2026-04-23 16:34:27.207054714 +0000 UTC m=+20.779567627" observedRunningTime="2026-04-23 16:34:28.245284141 +0000 UTC m=+21.817797061" watchObservedRunningTime="2026-04-23 16:34:28.245901318 +0000 UTC m=+21.818414238" Apr 23 16:34:28.362688 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:28.362530 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 16:34:29.001771 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:29.001664 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T16:34:28.362686054Z","UUID":"4aee313a-1cfc-490d-82d7-5e7aa17fecfc","Handler":null,"Name":"","Endpoint":""} Apr 23 16:34:29.003343 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:29.003320 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 16:34:29.003343 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:29.003342 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 16:34:29.028644 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:29.028618 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:29.028786 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:29.028743 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-glcj7" podUID="eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef" Apr 23 16:34:29.121979 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:29.121942 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cx8tc" event={"ID":"0cb82eee-8d58-46f1-8148-5a83f7d6a3a1","Type":"ContainerStarted","Data":"7e91b57a6329cd5673992bb8657c10a94d1548817f3a119d0107a1dc35e428a9"} Apr 23 16:34:29.123569 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:29.123541 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm" event={"ID":"ac2dc8cd-1e1d-484d-a299-65eb94658e63","Type":"ContainerStarted","Data":"85489a14cfccb4f60b32bca77813521db54313e64038bc8e3e72284f2aa9c3fa"} Apr 23 16:34:29.141843 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:29.141802 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-cx8tc" podStartSLOduration=5.012835487 podStartE2EDuration="22.141789795s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:34:09.756231943 +0000 UTC m=+3.328744843" lastFinishedPulling="2026-04-23 16:34:26.885186242 +0000 UTC m=+20.457699151" observedRunningTime="2026-04-23 16:34:29.141695577 +0000 UTC m=+22.714208519" watchObservedRunningTime="2026-04-23 16:34:29.141789795 +0000 UTC m=+22.714302715" Apr 23 16:34:30.025745 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:30.025721 2572 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:30.025909 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:30.025813 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7t2c" podUID="1fcbd9bc-88ba-48d1-978b-f8e2585ab84c" Apr 23 16:34:30.129561 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:30.129520 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" event={"ID":"4b2c8879-054c-4712-b5f0-7d3038cf3e84","Type":"ContainerStarted","Data":"863a636e2eaef969659f151cdc9904a4b12b58d3641f88b0a6febeddc46dc7a0"} Apr 23 16:34:30.131581 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:30.131548 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm" event={"ID":"ac2dc8cd-1e1d-484d-a299-65eb94658e63","Type":"ContainerStarted","Data":"e239e0b23c1fe042e3d2af49f74834813b040d9b598ca7e7c2a1c82e18bab9c5"} Apr 23 16:34:30.926461 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:30.926423 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-jn7xj" Apr 23 16:34:30.927889 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:30.927685 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-jn7xj" Apr 23 16:34:30.944502 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:30.944444 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9csvm" podStartSLOduration=4.523237067 
podStartE2EDuration="23.944429241s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:34:09.749656536 +0000 UTC m=+3.322169434" lastFinishedPulling="2026-04-23 16:34:29.17084871 +0000 UTC m=+22.743361608" observedRunningTime="2026-04-23 16:34:30.168975204 +0000 UTC m=+23.741488126" watchObservedRunningTime="2026-04-23 16:34:30.944429241 +0000 UTC m=+24.516942165" Apr 23 16:34:31.026796 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:31.026750 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:31.026974 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:31.026886 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-glcj7" podUID="eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef" Apr 23 16:34:31.233540 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:31.233459 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-jn7xj" Apr 23 16:34:31.234136 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:31.234001 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-jn7xj" Apr 23 16:34:32.026064 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:32.025721 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:32.026306 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:32.026099 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7t2c" podUID="1fcbd9bc-88ba-48d1-978b-f8e2585ab84c" Apr 23 16:34:32.139097 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:32.139064 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" event={"ID":"4b2c8879-054c-4712-b5f0-7d3038cf3e84","Type":"ContainerStarted","Data":"aae2cd859999115446135b69190724e0081a523f075ca268fb226500d09b7d14"} Apr 23 16:34:32.139272 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:32.139255 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:32.139323 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:32.139285 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:32.139402 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:32.139387 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:32.154427 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:32.154397 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:32.157366 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:32.157344 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:34:32.170686 ip-10-0-133-231 
kubenswrapper[2572]: I0423 16:34:32.170637 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" podStartSLOduration=7.751977773 podStartE2EDuration="25.170624626s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:34:09.751321899 +0000 UTC m=+3.323834799" lastFinishedPulling="2026-04-23 16:34:27.169968741 +0000 UTC m=+20.742481652" observedRunningTime="2026-04-23 16:34:32.169967482 +0000 UTC m=+25.742480406" watchObservedRunningTime="2026-04-23 16:34:32.170624626 +0000 UTC m=+25.743137548" Apr 23 16:34:33.028834 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:33.028562 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:33.028834 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:33.028695 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-glcj7" podUID="eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef" Apr 23 16:34:33.948622 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:33.948577 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-d7t2c"] Apr 23 16:34:33.948770 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:33.948706 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:33.948813 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:33.948797 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7t2c" podUID="1fcbd9bc-88ba-48d1-978b-f8e2585ab84c" Apr 23 16:34:33.951432 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:33.951406 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-glcj7"] Apr 23 16:34:33.951552 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:33.951539 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:33.951671 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:33.951652 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-glcj7" podUID="eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef" Apr 23 16:34:35.026439 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:35.026258 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:35.026799 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:35.026521 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-glcj7" podUID="eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef" Apr 23 16:34:35.146495 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:35.146466 2572 generic.go:358] "Generic (PLEG): container finished" podID="f24243b7-5732-41e7-a97d-ff3ef6a751d0" containerID="cdf2433a14e56447537c9a99f2c754414267754aef033eb87a4ceebe18a2f45c" exitCode=0 Apr 23 16:34:35.146660 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:35.146520 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r652k" event={"ID":"f24243b7-5732-41e7-a97d-ff3ef6a751d0","Type":"ContainerDied","Data":"cdf2433a14e56447537c9a99f2c754414267754aef033eb87a4ceebe18a2f45c"} Apr 23 16:34:36.026400 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:36.026326 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:36.026545 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:36.026450 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-d7t2c" podUID="1fcbd9bc-88ba-48d1-978b-f8e2585ab84c" Apr 23 16:34:37.026415 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:37.026381 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:37.026613 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:37.026471 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-glcj7" podUID="eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef" Apr 23 16:34:37.152761 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:37.152724 2572 generic.go:358] "Generic (PLEG): container finished" podID="f24243b7-5732-41e7-a97d-ff3ef6a751d0" containerID="a3ccf508d6f5a58904a2cc31bbbd99afb17d4a76e42a1e76a683c5a1268786b9" exitCode=0 Apr 23 16:34:37.152965 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:37.152768 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r652k" event={"ID":"f24243b7-5732-41e7-a97d-ff3ef6a751d0","Type":"ContainerDied","Data":"a3ccf508d6f5a58904a2cc31bbbd99afb17d4a76e42a1e76a683c5a1268786b9"} Apr 23 16:34:38.026603 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:38.026580 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:38.026787 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:38.026680 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7t2c" podUID="1fcbd9bc-88ba-48d1-978b-f8e2585ab84c" Apr 23 16:34:38.157278 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:38.157237 2572 generic.go:358] "Generic (PLEG): container finished" podID="f24243b7-5732-41e7-a97d-ff3ef6a751d0" containerID="3986f5a9619ae56fdf23261fa9e1774f7e4c3c5f5c816404c5e993fc56598c58" exitCode=0 Apr 23 16:34:38.157423 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:38.157296 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r652k" event={"ID":"f24243b7-5732-41e7-a97d-ff3ef6a751d0","Type":"ContainerDied","Data":"3986f5a9619ae56fdf23261fa9e1774f7e4c3c5f5c816404c5e993fc56598c58"} Apr 23 16:34:39.026131 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.026101 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:39.026326 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:39.026241 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-glcj7" podUID="eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef" Apr 23 16:34:39.717386 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.717352 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-231.ec2.internal" event="NodeReady" Apr 23 16:34:39.717808 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.717540 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 16:34:39.756501 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.756463 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z"] Apr 23 16:34:39.760888 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.760864 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nqvgv"] Apr 23 16:34:39.761079 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.761047 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z" Apr 23 16:34:39.763855 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.763825 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 23 16:34:39.764282 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.764264 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 23 16:34:39.764578 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.764555 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 23 16:34:39.764862 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.764845 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5d445c8494-5dps7"] Apr 23 16:34:39.765082 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.765021 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nqvgv" Apr 23 16:34:39.765315 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.765266 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 23 16:34:39.765520 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.765500 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 23 16:34:39.765804 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.765785 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 23 16:34:39.769531 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.769238 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-57798c5bd5-257tb"] Apr 23 16:34:39.770621 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.769659 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d445c8494-5dps7" Apr 23 16:34:39.773060 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.773040 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk"] Apr 23 16:34:39.773227 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.773201 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57798c5bd5-257tb" Apr 23 16:34:39.776852 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.776834 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk" Apr 23 16:34:39.778199 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.778110 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 23 16:34:39.778465 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.778442 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 16:34:39.778739 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.778589 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 16:34:39.778739 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.778603 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 16:34:39.778739 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.778663 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 23 16:34:39.779188 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.778835 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 23 16:34:39.779807 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.779505 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-q6vbq\"" Apr 23 16:34:39.779807 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.779597 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-njn6c\"" Apr 23 16:34:39.779807 ip-10-0-133-231 kubenswrapper[2572]: 
I0423 16:34:39.779657 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wg4fk"]
Apr 23 16:34:39.782699 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.782563 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 23 16:34:39.783392 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.783371 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gc997"]
Apr 23 16:34:39.783558 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.783539 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wg4fk"
Apr 23 16:34:39.791699 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.788215 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 16:34:39.791699 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.788994 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:34:39.791699 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.789310 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 23 16:34:39.791699 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.790065 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-2b8hh\""
Apr 23 16:34:39.791699 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.791313 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 23 16:34:39.792722 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.792691 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 16:34:39.792969 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.792946 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 23 16:34:39.793043 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.792969 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-sb2m5\""
Apr 23 16:34:39.793349 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.793327 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 23 16:34:39.794162 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.794142 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 23 16:34:39.795845 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.795813 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-bwh5t"]
Apr 23 16:34:39.795961 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.795937 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gc997"
Apr 23 16:34:39.799877 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.799858 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:34:39.801224 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.800695 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-lfp22"]
Apr 23 16:34:39.801224 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.801056 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t"
Apr 23 16:34:39.801677 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.801659 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 23 16:34:39.801834 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.801805 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-s82cg\""
Apr 23 16:34:39.801902 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.801721 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 23 16:34:39.804342 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.804251 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-6pbrl"]
Apr 23 16:34:39.804983 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.804622 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-lfp22"
Apr 23 16:34:39.805733 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.805714 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 23 16:34:39.806990 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.806970 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 23 16:34:39.806990 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.806982 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 23 16:34:39.807136 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.807071 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:34:39.807508 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.807317 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-jqcts\""
Apr 23 16:34:39.808543 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.808520 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 23 16:34:39.808746 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.808729 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 23 16:34:39.808955 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.808936 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6pbrl"
Apr 23 16:34:39.809134 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.808835 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-7rh2n"]
Apr 23 16:34:39.809896 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.809877 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 23 16:34:39.810244 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.810225 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-lkfxh\""
Apr 23 16:34:39.810436 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.810414 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 23 16:34:39.813501 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.813463 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-zfkkk\""
Apr 23 16:34:39.813942 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.813670 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 23 16:34:39.816578 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.816559 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xt7q"]
Apr 23 16:34:39.818769 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.818744 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 23 16:34:39.822859 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.822839 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 23 16:34:39.826266 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.826249 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xt7q"
Apr 23 16:34:39.826407 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.826151 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-77d69779b-8754l"]
Apr 23 16:34:39.826566 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.826548 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7rh2n"
Apr 23 16:34:39.828164 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.827996 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 23 16:34:39.829710 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.829680 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 16:34:39.830415 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.830184 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 16:34:39.830415 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.830248 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-8n5r2\""
Apr 23 16:34:39.830415 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.830303 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdfd48895-qqsc9"]
Apr 23 16:34:39.830415 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.830384 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 23 16:34:39.830843 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.830413 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-zk546\""
Apr 23 16:34:39.830843 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.830462 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 23 16:34:39.831009 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.830982 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:34:39.831755 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.831228 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 23 16:34:39.834208 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.834176 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-66d676f5f7-7b8cr"]
Apr 23 16:34:39.834987 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.834944 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-77d69779b-8754l"
Apr 23 16:34:39.837258 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.837242 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-bkcjp\""
Apr 23 16:34:39.837353 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.837319 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 23 16:34:39.837480 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.837460 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nqvgv"]
Apr 23 16:34:39.837541 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.837487 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z"]
Apr 23 16:34:39.837541 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.837501 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d445c8494-5dps7"]
Apr 23 16:34:39.837541 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.837513 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdfd48895-qqsc9"]
Apr 23 16:34:39.837541 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.837524 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gc997"]
Apr 23 16:34:39.837541 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.837538 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-hsn4f"]
Apr 23 16:34:39.837777 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.837542 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 23 16:34:39.837777 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.837702 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdfd48895-qqsc9"
Apr 23 16:34:39.838143 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.837889 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 23 16:34:39.838143 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.837980 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 23 16:34:39.838849 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.838831 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 23 16:34:39.840760 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.840739 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-g9pgx"]
Apr 23 16:34:39.840999 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.840979 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66d676f5f7-7b8cr"
Apr 23 16:34:39.841081 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.841032 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hsn4f"
Apr 23 16:34:39.842270 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.842247 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 23 16:34:39.842360 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.842300 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 23 16:34:39.843802 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.843778 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-d6mts\""
Apr 23 16:34:39.844277 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.844258 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-6pbrl"]
Apr 23 16:34:39.844473 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.844458 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-lfp22"]
Apr 23 16:34:39.844562 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.844496 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g9pgx"
Apr 23 16:34:39.844615 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.844493 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 16:34:39.844725 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.844564 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-77d69779b-8754l"]
Apr 23 16:34:39.844804 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.844743 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57798c5bd5-257tb"]
Apr 23 16:34:39.844804 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.844758 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk"]
Apr 23 16:34:39.844804 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.844769 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wg4fk"]
Apr 23 16:34:39.844804 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.844783 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-bwh5t"]
Apr 23 16:34:39.844804 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.844794 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-7rh2n"]
Apr 23 16:34:39.844804 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.844805 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-66d676f5f7-7b8cr"]
Apr 23 16:34:39.845103 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.844815 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xt7q"]
Apr 23 16:34:39.845103 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.844825 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g9pgx"]
Apr 23 16:34:39.845103 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.844835 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hsn4f"]
Apr 23 16:34:39.845103 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.844383 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-nghfn\""
Apr 23 16:34:39.845103 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.844882 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 23 16:34:39.845103 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.844929 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 16:34:39.845103 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.844957 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 16:34:39.849382 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.849365 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 23 16:34:39.850092 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.850077 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tz9n7\""
Apr 23 16:34:39.850354 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.850342 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 23 16:34:39.871151 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871132 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-image-registry-private-configuration\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7"
Apr 23 16:34:39.871229 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871159 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-trusted-ca\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7"
Apr 23 16:34:39.871229 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871179 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/970788cb-b97f-467f-bd8e-69787c8efef5-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-wg4fk\" (UID: \"970788cb-b97f-467f-bd8e-69787c8efef5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wg4fk"
Apr 23 16:34:39.871229 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871218 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc2b7\" (UniqueName: \"kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-kube-api-access-fc2b7\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7"
Apr 23 16:34:39.871322 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871255 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz8l7\" (UniqueName: \"kubernetes.io/projected/0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d-kube-api-access-rz8l7\") pod \"cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z\" (UID: \"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z"
Apr 23 16:34:39.871322 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871304 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-bound-sa-token\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7"
Apr 23 16:34:39.871322 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871320 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-certificates\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb"
Apr 23 16:34:39.871426 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871340 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtn5l\" (UniqueName: \"kubernetes.io/projected/970788cb-b97f-467f-bd8e-69787c8efef5-kube-api-access-gtn5l\") pod \"kube-storage-version-migrator-operator-6769c5d45-wg4fk\" (UID: \"970788cb-b97f-467f-bd8e-69787c8efef5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wg4fk"
Apr 23 16:34:39.871426 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871376 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5460e905-1b76-4876-8552-7f9866fe1dc0-image-registry-private-configuration\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb"
Apr 23 16:34:39.871426 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871396 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5460e905-1b76-4876-8552-7f9866fe1dc0-ca-trust-extracted\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb"
Apr 23 16:34:39.871426 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871418 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5460e905-1b76-4876-8552-7f9866fe1dc0-installation-pull-secrets\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb"
Apr 23 16:34:39.871573 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871434 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/970788cb-b97f-467f-bd8e-69787c8efef5-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-wg4fk\" (UID: \"970788cb-b97f-467f-bd8e-69787c8efef5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wg4fk"
Apr 23 16:34:39.871573 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871452 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xkvm\" (UniqueName: \"kubernetes.io/projected/6e3b3b5e-7dd7-421c-9733-f304050ddbce-kube-api-access-5xkvm\") pod \"cluster-monitoring-operator-75587bd455-d4nsk\" (UID: \"6e3b3b5e-7dd7-421c-9733-f304050ddbce\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk"
Apr 23 16:34:39.871573 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871474 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n69r2\" (UniqueName: \"kubernetes.io/projected/cef5ddfd-948f-4294-a2d2-9123e23feea6-kube-api-access-n69r2\") pod \"volume-data-source-validator-7c6cbb6c87-nqvgv\" (UID: \"cef5ddfd-948f-4294-a2d2-9123e23feea6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nqvgv"
Apr 23 16:34:39.871573 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871491 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smh2t\" (UniqueName: \"kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-kube-api-access-smh2t\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb"
Apr 23 16:34:39.871573 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871511 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6e3b3b5e-7dd7-421c-9733-f304050ddbce-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d4nsk\" (UID: \"6e3b3b5e-7dd7-421c-9733-f304050ddbce\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk"
Apr 23 16:34:39.871573 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871535 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-bound-sa-token\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb"
Apr 23 16:34:39.871573 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871569 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb"
Apr 23 16:34:39.871997 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871588 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7"
Apr 23 16:34:39.871997 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871606 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmxmr\" (UniqueName: \"kubernetes.io/projected/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-kube-api-access-gmxmr\") pod \"cluster-samples-operator-6dc5bdb6b4-gc997\" (UID: \"6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gc997"
Apr 23 16:34:39.871997 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871652 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z\" (UID: \"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z"
Apr 23 16:34:39.871997 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871691 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z\" (UID: \"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z"
Apr 23 16:34:39.871997 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871725 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5460e905-1b76-4876-8552-7f9866fe1dc0-trusted-ca\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb"
Apr 23 16:34:39.871997 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871777 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d-hub\") pod \"cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z\" (UID: \"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z"
Apr 23 16:34:39.871997 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871815 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z\" (UID: \"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z"
Apr 23 16:34:39.871997 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.871858 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-installation-pull-secrets\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7"
Apr 23 16:34:39.872451 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.872005 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6e3b3b5e-7dd7-421c-9733-f304050ddbce-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-d4nsk\" (UID: \"6e3b3b5e-7dd7-421c-9733-f304050ddbce\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk"
Apr 23 16:34:39.872451 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.872037 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d-ca\") pod \"cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z\" (UID: \"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z"
Apr 23 16:34:39.872451 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.872188 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-certificates\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7"
Apr 23 16:34:39.872451 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.872218 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gc997\" (UID: \"6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gc997"
Apr 23 16:34:39.872451 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.872252 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-ca-trust-extracted\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7"
Apr 23 16:34:39.972933 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.972883 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af33177a-4a99-4b44-8427-ff5e05da026f-serving-cert\") pod \"insights-operator-585dfdc468-lfp22\" (UID: \"af33177a-4a99-4b44-8427-ff5e05da026f\") " pod="openshift-insights/insights-operator-585dfdc468-lfp22"
Apr 23 16:34:39.973102 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.972936 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2744f4f-889f-4833-82ac-129e28488162-serving-cert\") pod \"console-operator-9d4b6777b-bwh5t\" (UID: \"d2744f4f-889f-4833-82ac-129e28488162\") " pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t"
Apr 23 16:34:39.973102 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.973014 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-bound-sa-token\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7"
Apr 23 16:34:39.973102 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.973077 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-certificates\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb"
Apr 23 16:34:39.973280 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.973105 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtn5l\" (UniqueName: \"kubernetes.io/projected/970788cb-b97f-467f-bd8e-69787c8efef5-kube-api-access-gtn5l\") pod \"kube-storage-version-migrator-operator-6769c5d45-wg4fk\" (UID: \"970788cb-b97f-467f-bd8e-69787c8efef5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wg4fk"
Apr 23 16:34:39.973280 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.973154 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/af33177a-4a99-4b44-8427-ff5e05da026f-tmp\") pod \"insights-operator-585dfdc468-lfp22\" (UID: \"af33177a-4a99-4b44-8427-ff5e05da026f\") " pod="openshift-insights/insights-operator-585dfdc468-lfp22"
Apr 23 16:34:39.973280 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.973181 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/af33177a-4a99-4b44-8427-ff5e05da026f-snapshots\") pod \"insights-operator-585dfdc468-lfp22\" (UID: \"af33177a-4a99-4b44-8427-ff5e05da026f\") " pod="openshift-insights/insights-operator-585dfdc468-lfp22"
Apr 23 16:34:39.973280 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.973210 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5460e905-1b76-4876-8552-7f9866fe1dc0-image-registry-private-configuration\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb"
Apr 23 16:34:39.973280 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.973247 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9dqn\" (UniqueName: \"kubernetes.io/projected/5690f876-3daa-4cdb-bfe2-53f9544cadae-kube-api-access-j9dqn\") pod \"klusterlet-addon-workmgr-66d676f5f7-7b8cr\" (UID: \"5690f876-3daa-4cdb-bfe2-53f9544cadae\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66d676f5f7-7b8cr"
Apr 23 16:34:39.974158 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.973756 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5460e905-1b76-4876-8552-7f9866fe1dc0-ca-trust-extracted\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb"
Apr 23 16:34:39.974158 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.973824 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5460e905-1b76-4876-8552-7f9866fe1dc0-installation-pull-secrets\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb"
Apr 23 16:34:39.974158 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.973860 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/970788cb-b97f-467f-bd8e-69787c8efef5-config\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-wg4fk\" (UID: \"970788cb-b97f-467f-bd8e-69787c8efef5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wg4fk" Apr 23 16:34:39.974158 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.973893 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xkvm\" (UniqueName: \"kubernetes.io/projected/6e3b3b5e-7dd7-421c-9733-f304050ddbce-kube-api-access-5xkvm\") pod \"cluster-monitoring-operator-75587bd455-d4nsk\" (UID: \"6e3b3b5e-7dd7-421c-9733-f304050ddbce\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk" Apr 23 16:34:39.974158 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.973946 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n69r2\" (UniqueName: \"kubernetes.io/projected/cef5ddfd-948f-4294-a2d2-9123e23feea6-kube-api-access-n69r2\") pod \"volume-data-source-validator-7c6cbb6c87-nqvgv\" (UID: \"cef5ddfd-948f-4294-a2d2-9123e23feea6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nqvgv" Apr 23 16:34:39.974158 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.973997 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smh2t\" (UniqueName: \"kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-kube-api-access-smh2t\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb" Apr 23 16:34:39.974158 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.974034 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa0af365-ad6e-4695-bd7d-c6838cbcf027-service-ca-bundle\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " 
pod="openshift-ingress/router-default-77d69779b-8754l" Apr 23 16:34:39.974597 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.974211 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2744f4f-889f-4833-82ac-129e28488162-config\") pod \"console-operator-9d4b6777b-bwh5t\" (UID: \"d2744f4f-889f-4833-82ac-129e28488162\") " pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t" Apr 23 16:34:39.974597 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.974290 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6e3b3b5e-7dd7-421c-9733-f304050ddbce-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d4nsk\" (UID: \"6e3b3b5e-7dd7-421c-9733-f304050ddbce\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk" Apr 23 16:34:39.974597 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.974314 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5460e905-1b76-4876-8552-7f9866fe1dc0-ca-trust-extracted\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb" Apr 23 16:34:39.974597 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.974347 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799-config\") pod \"service-ca-operator-d6fc45fc5-5xt7q\" (UID: \"dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xt7q" Apr 23 16:34:39.974597 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.974371 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6pbrl\" (UID: \"860ae3e8-8bc5-4280-aefd-e5190c5e1db8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6pbrl" Apr 23 16:34:39.974597 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.974392 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-certificates\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb" Apr 23 16:34:39.974597 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.974466 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d997e103-1c8d-4bb4-a579-2d6b344c089f-tmp-dir\") pod \"dns-default-g9pgx\" (UID: \"d997e103-1c8d-4bb4-a579-2d6b344c089f\") " pod="openshift-dns/dns-default-g9pgx" Apr 23 16:34:39.974907 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.974629 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-bound-sa-token\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb" Apr 23 16:34:39.974907 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.974681 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af33177a-4a99-4b44-8427-ff5e05da026f-service-ca-bundle\") pod \"insights-operator-585dfdc468-lfp22\" (UID: 
\"af33177a-4a99-4b44-8427-ff5e05da026f\") " pod="openshift-insights/insights-operator-585dfdc468-lfp22" Apr 23 16:34:39.974907 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.974718 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb" Apr 23 16:34:39.975163 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.975138 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/970788cb-b97f-467f-bd8e-69787c8efef5-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-wg4fk\" (UID: \"970788cb-b97f-467f-bd8e-69787c8efef5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wg4fk" Apr 23 16:34:39.975266 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.975245 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7" Apr 23 16:34:39.975319 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.975292 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4778171-44e0-4227-8cef-29899b536604-cert\") pod \"ingress-canary-hsn4f\" (UID: \"b4778171-44e0-4227-8cef-29899b536604\") " pod="openshift-ingress-canary/ingress-canary-hsn4f" Apr 23 16:34:39.975366 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.975335 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-lbrlp\" (UniqueName: \"kubernetes.io/projected/d997e103-1c8d-4bb4-a579-2d6b344c089f-kube-api-access-lbrlp\") pod \"dns-default-g9pgx\" (UID: \"d997e103-1c8d-4bb4-a579-2d6b344c089f\") " pod="openshift-dns/dns-default-g9pgx" Apr 23 16:34:39.975415 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.975378 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmxmr\" (UniqueName: \"kubernetes.io/projected/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-kube-api-access-gmxmr\") pod \"cluster-samples-operator-6dc5bdb6b4-gc997\" (UID: \"6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gc997" Apr 23 16:34:39.975484 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.975464 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z\" (UID: \"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z" Apr 23 16:34:39.975556 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:39.975538 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:34:39.975613 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:39.975560 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57798c5bd5-257tb: secret "image-registry-tls" not found Apr 23 16:34:39.975613 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.975569 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z\" (UID: 
\"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z" Apr 23 16:34:39.975718 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:39.975626 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls podName:5460e905-1b76-4876-8552-7f9866fe1dc0 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:40.475596845 +0000 UTC m=+34.048109760 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls") pod "image-registry-57798c5bd5-257tb" (UID: "5460e905-1b76-4876-8552-7f9866fe1dc0") : secret "image-registry-tls" not found Apr 23 16:34:39.975718 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.975653 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz7hl\" (UniqueName: \"kubernetes.io/projected/af33177a-4a99-4b44-8427-ff5e05da026f-kube-api-access-vz7hl\") pod \"insights-operator-585dfdc468-lfp22\" (UID: \"af33177a-4a99-4b44-8427-ff5e05da026f\") " pod="openshift-insights/insights-operator-585dfdc468-lfp22" Apr 23 16:34:39.975718 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.975702 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5460e905-1b76-4876-8552-7f9866fe1dc0-trusted-ca\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb" Apr 23 16:34:39.975892 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.975739 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d-hub\") pod \"cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z\" (UID: 
\"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z" Apr 23 16:34:39.976191 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:39.976165 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:34:39.976276 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:39.976193 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d445c8494-5dps7: secret "image-registry-tls" not found Apr 23 16:34:39.976276 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:39.976256 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls podName:a1ccc661-cbd0-49b7-b172-5749b2c3e73f nodeName:}" failed. No retries permitted until 2026-04-23 16:34:40.476233783 +0000 UTC m=+34.048746696 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls") pod "image-registry-5d445c8494-5dps7" (UID: "a1ccc661-cbd0-49b7-b172-5749b2c3e73f") : secret "image-registry-tls" not found Apr 23 16:34:39.976395 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.976301 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z\" (UID: \"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z" Apr 23 16:34:39.976395 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.976367 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7fh9\" (UniqueName: 
\"kubernetes.io/projected/2b95d992-bc38-4499-9a6e-ff4e0571a154-kube-api-access-j7fh9\") pod \"network-check-source-8894fc9bd-7rh2n\" (UID: \"2b95d992-bc38-4499-9a6e-ff4e0571a154\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7rh2n" Apr 23 16:34:39.976500 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.976410 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af33177a-4a99-4b44-8427-ff5e05da026f-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-lfp22\" (UID: \"af33177a-4a99-4b44-8427-ff5e05da026f\") " pod="openshift-insights/insights-operator-585dfdc468-lfp22" Apr 23 16:34:39.976500 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.976436 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2744f4f-889f-4833-82ac-129e28488162-trusted-ca\") pod \"console-operator-9d4b6777b-bwh5t\" (UID: \"d2744f4f-889f-4833-82ac-129e28488162\") " pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t" Apr 23 16:34:39.976500 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.976486 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d997e103-1c8d-4bb4-a579-2d6b344c089f-config-volume\") pod \"dns-default-g9pgx\" (UID: \"d997e103-1c8d-4bb4-a579-2d6b344c089f\") " pod="openshift-dns/dns-default-g9pgx" Apr 23 16:34:39.976631 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.976550 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-installation-pull-secrets\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " 
pod="openshift-image-registry/image-registry-5d445c8494-5dps7" Apr 23 16:34:39.976631 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.976591 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6e3b3b5e-7dd7-421c-9733-f304050ddbce-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-d4nsk\" (UID: \"6e3b3b5e-7dd7-421c-9733-f304050ddbce\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk" Apr 23 16:34:39.976721 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.976639 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d-ca\") pod \"cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z\" (UID: \"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z" Apr 23 16:34:39.976721 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.976693 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz5bm\" (UniqueName: \"kubernetes.io/projected/4b7746a0-7bbc-43fc-86d8-43f0e52d77c5-kube-api-access-cz5bm\") pod \"managed-serviceaccount-addon-agent-cdfd48895-qqsc9\" (UID: \"4b7746a0-7bbc-43fc-86d8-43f0e52d77c5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdfd48895-qqsc9" Apr 23 16:34:39.976801 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.976733 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5690f876-3daa-4cdb-bfe2-53f9544cadae-tmp\") pod \"klusterlet-addon-workmgr-66d676f5f7-7b8cr\" (UID: \"5690f876-3daa-4cdb-bfe2-53f9544cadae\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66d676f5f7-7b8cr" Apr 23 16:34:39.976801 ip-10-0-133-231 kubenswrapper[2572]: I0423 
16:34:39.976719 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z\" (UID: \"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z" Apr 23 16:34:39.977163 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.977139 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d997e103-1c8d-4bb4-a579-2d6b344c089f-metrics-tls\") pod \"dns-default-g9pgx\" (UID: \"d997e103-1c8d-4bb4-a579-2d6b344c089f\") " pod="openshift-dns/dns-default-g9pgx" Apr 23 16:34:39.977246 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.977197 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-certificates\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7" Apr 23 16:34:39.977246 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.977227 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gc997\" (UID: \"6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gc997" Apr 23 16:34:39.977335 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.977262 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-nginx-conf\") pod 
\"networking-console-plugin-cb95c66f6-6pbrl\" (UID: \"860ae3e8-8bc5-4280-aefd-e5190c5e1db8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6pbrl" Apr 23 16:34:39.977335 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.977297 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6f5w\" (UniqueName: \"kubernetes.io/projected/b4778171-44e0-4227-8cef-29899b536604-kube-api-access-s6f5w\") pod \"ingress-canary-hsn4f\" (UID: \"b4778171-44e0-4227-8cef-29899b536604\") " pod="openshift-ingress-canary/ingress-canary-hsn4f" Apr 23 16:34:39.977518 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:39.977503 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 16:34:39.977565 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.977546 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qgjq\" (UniqueName: \"kubernetes.io/projected/dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799-kube-api-access-8qgjq\") pod \"service-ca-operator-d6fc45fc5-5xt7q\" (UID: \"dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xt7q" Apr 23 16:34:39.977666 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.977647 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-ca-trust-extracted\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7" Apr 23 16:34:39.977738 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:39.977726 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-samples-operator-tls 
podName:6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb nodeName:}" failed. No retries permitted until 2026-04-23 16:34:40.477693759 +0000 UTC m=+34.050206664 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gc997" (UID: "6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb") : secret "samples-operator-tls" not found Apr 23 16:34:39.978218 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.978195 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-ca-trust-extracted\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7" Apr 23 16:34:39.978686 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.978665 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-certificates\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7" Apr 23 16:34:39.979214 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.979160 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-metrics-certs\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l" Apr 23 16:34:39.979675 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:39.979646 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found 
Apr 23 16:34:39.979769 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.979703 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d-hub\") pod \"cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z\" (UID: \"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z" Apr 23 16:34:39.980735 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.980343 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5460e905-1b76-4876-8552-7f9866fe1dc0-installation-pull-secrets\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb" Apr 23 16:34:39.980735 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.980539 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6e3b3b5e-7dd7-421c-9733-f304050ddbce-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-d4nsk\" (UID: \"6e3b3b5e-7dd7-421c-9733-f304050ddbce\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk" Apr 23 16:34:39.980735 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.980607 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-image-registry-private-configuration\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7" Apr 23 16:34:39.980735 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.980675 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-trusted-ca\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7"
Apr 23 16:34:39.980735 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.980715 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-stats-auth\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l"
Apr 23 16:34:39.981042 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.980736 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z\" (UID: \"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z"
Apr 23 16:34:39.981042 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.980753 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q92lh\" (UniqueName: \"kubernetes.io/projected/fa0af365-ad6e-4695-bd7d-c6838cbcf027-kube-api-access-q92lh\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l"
Apr 23 16:34:39.981042 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.980876 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/970788cb-b97f-467f-bd8e-69787c8efef5-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-wg4fk\" (UID: \"970788cb-b97f-467f-bd8e-69787c8efef5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wg4fk"
Apr 23 16:34:39.981201 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.981028 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-default-certificate\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l"
Apr 23 16:34:39.981201 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:39.981077 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e3b3b5e-7dd7-421c-9733-f304050ddbce-cluster-monitoring-operator-tls podName:6e3b3b5e-7dd7-421c-9733-f304050ddbce nodeName:}" failed. No retries permitted until 2026-04-23 16:34:40.481046498 +0000 UTC m=+34.053559400 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6e3b3b5e-7dd7-421c-9733-f304050ddbce-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-d4nsk" (UID: "6e3b3b5e-7dd7-421c-9733-f304050ddbce") : secret "cluster-monitoring-operator-tls" not found
Apr 23 16:34:39.981201 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.981153 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799-serving-cert\") pod \"service-ca-operator-d6fc45fc5-5xt7q\" (UID: \"dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xt7q"
Apr 23 16:34:39.981369 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.981235 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fc2b7\" (UniqueName: \"kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-kube-api-access-fc2b7\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7"
Apr 23 16:34:39.981369 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.981298 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvlfh\" (UniqueName: \"kubernetes.io/projected/d2744f4f-889f-4833-82ac-129e28488162-kube-api-access-tvlfh\") pod \"console-operator-9d4b6777b-bwh5t\" (UID: \"d2744f4f-889f-4833-82ac-129e28488162\") " pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t"
Apr 23 16:34:39.981648 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.981625 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rz8l7\" (UniqueName: \"kubernetes.io/projected/0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d-kube-api-access-rz8l7\") pod \"cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z\" (UID: \"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z"
Apr 23 16:34:39.982900 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.982828 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-trusted-ca\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7"
Apr 23 16:34:39.982900 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.982865 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/5690f876-3daa-4cdb-bfe2-53f9544cadae-klusterlet-config\") pod \"klusterlet-addon-workmgr-66d676f5f7-7b8cr\" (UID: \"5690f876-3daa-4cdb-bfe2-53f9544cadae\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66d676f5f7-7b8cr"
Apr 23 16:34:39.983063 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.982932 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d-ca\") pod \"cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z\" (UID: \"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z"
Apr 23 16:34:39.983063 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.982953 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4b7746a0-7bbc-43fc-86d8-43f0e52d77c5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-cdfd48895-qqsc9\" (UID: \"4b7746a0-7bbc-43fc-86d8-43f0e52d77c5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdfd48895-qqsc9"
Apr 23 16:34:39.983782 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.983431 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5460e905-1b76-4876-8552-7f9866fe1dc0-image-registry-private-configuration\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb"
Apr 23 16:34:39.984053 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.983931 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-bound-sa-token\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7"
Apr 23 16:34:39.985069 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.984347 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-image-registry-private-configuration\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7"
Apr 23 16:34:39.985069 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.984662 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-installation-pull-secrets\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7"
Apr 23 16:34:39.985069 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.984793 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z\" (UID: \"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z"
Apr 23 16:34:39.986276 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.986237 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/970788cb-b97f-467f-bd8e-69787c8efef5-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-wg4fk\" (UID: \"970788cb-b97f-467f-bd8e-69787c8efef5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wg4fk"
Apr 23 16:34:39.986575 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.986532 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtn5l\" (UniqueName: \"kubernetes.io/projected/970788cb-b97f-467f-bd8e-69787c8efef5-kube-api-access-gtn5l\") pod \"kube-storage-version-migrator-operator-6769c5d45-wg4fk\" (UID: \"970788cb-b97f-467f-bd8e-69787c8efef5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wg4fk"
Apr 23 16:34:39.986705 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.986647 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5460e905-1b76-4876-8552-7f9866fe1dc0-trusted-ca\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb"
Apr 23 16:34:39.993757 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.993733 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n69r2\" (UniqueName: \"kubernetes.io/projected/cef5ddfd-948f-4294-a2d2-9123e23feea6-kube-api-access-n69r2\") pod \"volume-data-source-validator-7c6cbb6c87-nqvgv\" (UID: \"cef5ddfd-948f-4294-a2d2-9123e23feea6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nqvgv"
Apr 23 16:34:39.993859 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.993836 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-bound-sa-token\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb"
Apr 23 16:34:39.995052 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.995002 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc2b7\" (UniqueName: \"kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-kube-api-access-fc2b7\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7"
Apr 23 16:34:39.995627 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.995601 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xkvm\" (UniqueName: \"kubernetes.io/projected/6e3b3b5e-7dd7-421c-9733-f304050ddbce-kube-api-access-5xkvm\") pod \"cluster-monitoring-operator-75587bd455-d4nsk\" (UID: \"6e3b3b5e-7dd7-421c-9733-f304050ddbce\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk"
Apr 23 16:34:39.995930 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.995892 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmxmr\" (UniqueName: \"kubernetes.io/projected/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-kube-api-access-gmxmr\") pod \"cluster-samples-operator-6dc5bdb6b4-gc997\" (UID: \"6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gc997"
Apr 23 16:34:39.996031 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.995970 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz8l7\" (UniqueName: \"kubernetes.io/projected/0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d-kube-api-access-rz8l7\") pod \"cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z\" (UID: \"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z"
Apr 23 16:34:39.997424 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:39.997393 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smh2t\" (UniqueName: \"kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-kube-api-access-smh2t\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb"
Apr 23 16:34:40.026497 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.026468 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7t2c"
Apr 23 16:34:40.029441 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.029415 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4csk9\""
Apr 23 16:34:40.081711 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.081682 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z"
Apr 23 16:34:40.083591 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.083569 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvlfh\" (UniqueName: \"kubernetes.io/projected/d2744f4f-889f-4833-82ac-129e28488162-kube-api-access-tvlfh\") pod \"console-operator-9d4b6777b-bwh5t\" (UID: \"d2744f4f-889f-4833-82ac-129e28488162\") " pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t"
Apr 23 16:34:40.083717 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.083609 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/5690f876-3daa-4cdb-bfe2-53f9544cadae-klusterlet-config\") pod \"klusterlet-addon-workmgr-66d676f5f7-7b8cr\" (UID: \"5690f876-3daa-4cdb-bfe2-53f9544cadae\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66d676f5f7-7b8cr"
Apr 23 16:34:40.083717 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.083636 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4b7746a0-7bbc-43fc-86d8-43f0e52d77c5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-cdfd48895-qqsc9\" (UID: \"4b7746a0-7bbc-43fc-86d8-43f0e52d77c5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdfd48895-qqsc9"
Apr 23 16:34:40.083717 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.083667 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af33177a-4a99-4b44-8427-ff5e05da026f-serving-cert\") pod \"insights-operator-585dfdc468-lfp22\" (UID: \"af33177a-4a99-4b44-8427-ff5e05da026f\") " pod="openshift-insights/insights-operator-585dfdc468-lfp22"
Apr 23 16:34:40.083717 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.083690 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2744f4f-889f-4833-82ac-129e28488162-serving-cert\") pod \"console-operator-9d4b6777b-bwh5t\" (UID: \"d2744f4f-889f-4833-82ac-129e28488162\") " pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t"
Apr 23 16:34:40.083933 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.083874 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/af33177a-4a99-4b44-8427-ff5e05da026f-tmp\") pod \"insights-operator-585dfdc468-lfp22\" (UID: \"af33177a-4a99-4b44-8427-ff5e05da026f\") " pod="openshift-insights/insights-operator-585dfdc468-lfp22"
Apr 23 16:34:40.083992 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.083937 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/af33177a-4a99-4b44-8427-ff5e05da026f-snapshots\") pod \"insights-operator-585dfdc468-lfp22\" (UID: \"af33177a-4a99-4b44-8427-ff5e05da026f\") " pod="openshift-insights/insights-operator-585dfdc468-lfp22"
Apr 23 16:34:40.083992 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.083971 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9dqn\" (UniqueName: \"kubernetes.io/projected/5690f876-3daa-4cdb-bfe2-53f9544cadae-kube-api-access-j9dqn\") pod \"klusterlet-addon-workmgr-66d676f5f7-7b8cr\" (UID: \"5690f876-3daa-4cdb-bfe2-53f9544cadae\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66d676f5f7-7b8cr"
Apr 23 16:34:40.084089 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084012 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa0af365-ad6e-4695-bd7d-c6838cbcf027-service-ca-bundle\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l"
Apr 23 16:34:40.084089 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084035 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2744f4f-889f-4833-82ac-129e28488162-config\") pod \"console-operator-9d4b6777b-bwh5t\" (UID: \"d2744f4f-889f-4833-82ac-129e28488162\") " pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t"
Apr 23 16:34:40.084089 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084077 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799-config\") pod \"service-ca-operator-d6fc45fc5-5xt7q\" (UID: \"dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xt7q"
Apr 23 16:34:40.084228 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084105 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6pbrl\" (UID: \"860ae3e8-8bc5-4280-aefd-e5190c5e1db8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6pbrl"
Apr 23 16:34:40.084228 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084133 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d997e103-1c8d-4bb4-a579-2d6b344c089f-tmp-dir\") pod \"dns-default-g9pgx\" (UID: \"d997e103-1c8d-4bb4-a579-2d6b344c089f\") " pod="openshift-dns/dns-default-g9pgx"
Apr 23 16:34:40.084228 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084164 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af33177a-4a99-4b44-8427-ff5e05da026f-service-ca-bundle\") pod \"insights-operator-585dfdc468-lfp22\" (UID: \"af33177a-4a99-4b44-8427-ff5e05da026f\") " pod="openshift-insights/insights-operator-585dfdc468-lfp22"
Apr 23 16:34:40.084228 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084216 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4778171-44e0-4227-8cef-29899b536604-cert\") pod \"ingress-canary-hsn4f\" (UID: \"b4778171-44e0-4227-8cef-29899b536604\") " pod="openshift-ingress-canary/ingress-canary-hsn4f"
Apr 23 16:34:40.084408 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084242 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbrlp\" (UniqueName: \"kubernetes.io/projected/d997e103-1c8d-4bb4-a579-2d6b344c089f-kube-api-access-lbrlp\") pod \"dns-default-g9pgx\" (UID: \"d997e103-1c8d-4bb4-a579-2d6b344c089f\") " pod="openshift-dns/dns-default-g9pgx"
Apr 23 16:34:40.084408 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084272 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vz7hl\" (UniqueName: \"kubernetes.io/projected/af33177a-4a99-4b44-8427-ff5e05da026f-kube-api-access-vz7hl\") pod \"insights-operator-585dfdc468-lfp22\" (UID: \"af33177a-4a99-4b44-8427-ff5e05da026f\") " pod="openshift-insights/insights-operator-585dfdc468-lfp22"
Apr 23 16:34:40.084408 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084308 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7fh9\" (UniqueName: \"kubernetes.io/projected/2b95d992-bc38-4499-9a6e-ff4e0571a154-kube-api-access-j7fh9\") pod \"network-check-source-8894fc9bd-7rh2n\" (UID: \"2b95d992-bc38-4499-9a6e-ff4e0571a154\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7rh2n"
Apr 23 16:34:40.084408 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084339 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af33177a-4a99-4b44-8427-ff5e05da026f-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-lfp22\" (UID: \"af33177a-4a99-4b44-8427-ff5e05da026f\") " pod="openshift-insights/insights-operator-585dfdc468-lfp22"
Apr 23 16:34:40.084408 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084365 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2744f4f-889f-4833-82ac-129e28488162-trusted-ca\") pod \"console-operator-9d4b6777b-bwh5t\" (UID: \"d2744f4f-889f-4833-82ac-129e28488162\") " pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t"
Apr 23 16:34:40.084646 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084388 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d997e103-1c8d-4bb4-a579-2d6b344c089f-config-volume\") pod \"dns-default-g9pgx\" (UID: \"d997e103-1c8d-4bb4-a579-2d6b344c089f\") " pod="openshift-dns/dns-default-g9pgx"
Apr 23 16:34:40.084646 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084446 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cz5bm\" (UniqueName: \"kubernetes.io/projected/4b7746a0-7bbc-43fc-86d8-43f0e52d77c5-kube-api-access-cz5bm\") pod \"managed-serviceaccount-addon-agent-cdfd48895-qqsc9\" (UID: \"4b7746a0-7bbc-43fc-86d8-43f0e52d77c5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdfd48895-qqsc9"
Apr 23 16:34:40.084646 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084471 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5690f876-3daa-4cdb-bfe2-53f9544cadae-tmp\") pod \"klusterlet-addon-workmgr-66d676f5f7-7b8cr\" (UID: \"5690f876-3daa-4cdb-bfe2-53f9544cadae\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66d676f5f7-7b8cr"
Apr 23 16:34:40.084646 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084503 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d997e103-1c8d-4bb4-a579-2d6b344c089f-metrics-tls\") pod \"dns-default-g9pgx\" (UID: \"d997e103-1c8d-4bb4-a579-2d6b344c089f\") " pod="openshift-dns/dns-default-g9pgx"
Apr 23 16:34:40.084646 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084549 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-6pbrl\" (UID: \"860ae3e8-8bc5-4280-aefd-e5190c5e1db8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6pbrl"
Apr 23 16:34:40.084646 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084574 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6f5w\" (UniqueName: \"kubernetes.io/projected/b4778171-44e0-4227-8cef-29899b536604-kube-api-access-s6f5w\") pod \"ingress-canary-hsn4f\" (UID: \"b4778171-44e0-4227-8cef-29899b536604\") " pod="openshift-ingress-canary/ingress-canary-hsn4f"
Apr 23 16:34:40.084646 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084611 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qgjq\" (UniqueName: \"kubernetes.io/projected/dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799-kube-api-access-8qgjq\") pod \"service-ca-operator-d6fc45fc5-5xt7q\" (UID: \"dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xt7q"
Apr 23 16:34:40.085060 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084657 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-metrics-certs\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l"
Apr 23 16:34:40.085060 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084689 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-stats-auth\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l"
Apr 23 16:34:40.085060 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084731 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q92lh\" (UniqueName: \"kubernetes.io/projected/fa0af365-ad6e-4695-bd7d-c6838cbcf027-kube-api-access-q92lh\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l"
Apr 23 16:34:40.085060 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084784 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-default-certificate\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l"
Apr 23 16:34:40.085060 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.084821 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799-serving-cert\") pod \"service-ca-operator-d6fc45fc5-5xt7q\" (UID: \"dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xt7q"
Apr 23 16:34:40.085060 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.085029 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:34:40.085340 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.085101 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d997e103-1c8d-4bb4-a579-2d6b344c089f-metrics-tls podName:d997e103-1c8d-4bb4-a579-2d6b344c089f nodeName:}" failed. No retries permitted until 2026-04-23 16:34:40.585081285 +0000 UTC m=+34.157594203 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d997e103-1c8d-4bb4-a579-2d6b344c089f-metrics-tls") pod "dns-default-g9pgx" (UID: "d997e103-1c8d-4bb4-a579-2d6b344c089f") : secret "dns-default-metrics-tls" not found
Apr 23 16:34:40.085469 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.085445 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 16:34:40.085581 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.085513 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-metrics-certs podName:fa0af365-ad6e-4695-bd7d-c6838cbcf027 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:40.58548884 +0000 UTC m=+34.158001745 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-metrics-certs") pod "router-default-77d69779b-8754l" (UID: "fa0af365-ad6e-4695-bd7d-c6838cbcf027") : secret "router-metrics-certs-default" not found
Apr 23 16:34:40.086595 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.086250 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799-config\") pod \"service-ca-operator-d6fc45fc5-5xt7q\" (UID: \"dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xt7q"
Apr 23 16:34:40.086595 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.086362 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-6pbrl\" (UID: \"860ae3e8-8bc5-4280-aefd-e5190c5e1db8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6pbrl"
Apr 23 16:34:40.086595 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.086361 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af33177a-4a99-4b44-8427-ff5e05da026f-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-lfp22\" (UID: \"af33177a-4a99-4b44-8427-ff5e05da026f\") " pod="openshift-insights/insights-operator-585dfdc468-lfp22"
Apr 23 16:34:40.086814 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.086668 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af33177a-4a99-4b44-8427-ff5e05da026f-serving-cert\") pod \"insights-operator-585dfdc468-lfp22\" (UID: \"af33177a-4a99-4b44-8427-ff5e05da026f\") " pod="openshift-insights/insights-operator-585dfdc468-lfp22"
Apr 23 16:34:40.086814 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.086741 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fa0af365-ad6e-4695-bd7d-c6838cbcf027-service-ca-bundle podName:fa0af365-ad6e-4695-bd7d-c6838cbcf027 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:40.586723291 +0000 UTC m=+34.159236197 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fa0af365-ad6e-4695-bd7d-c6838cbcf027-service-ca-bundle") pod "router-default-77d69779b-8754l" (UID: "fa0af365-ad6e-4695-bd7d-c6838cbcf027") : configmap references non-existent config key: service-ca.crt
Apr 23 16:34:40.088107 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.086970 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/af33177a-4a99-4b44-8427-ff5e05da026f-snapshots\") pod \"insights-operator-585dfdc468-lfp22\" (UID: \"af33177a-4a99-4b44-8427-ff5e05da026f\") " pod="openshift-insights/insights-operator-585dfdc468-lfp22"
Apr 23 16:34:40.088107 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.087231 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2744f4f-889f-4833-82ac-129e28488162-trusted-ca\") pod \"console-operator-9d4b6777b-bwh5t\" (UID: \"d2744f4f-889f-4833-82ac-129e28488162\") " pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t"
Apr 23 16:34:40.088107 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.087284 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2744f4f-889f-4833-82ac-129e28488162-serving-cert\") pod \"console-operator-9d4b6777b-bwh5t\" (UID: \"d2744f4f-889f-4833-82ac-129e28488162\") " pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t"
Apr 23 16:34:40.088107 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.087333 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2744f4f-889f-4833-82ac-129e28488162-config\") pod \"console-operator-9d4b6777b-bwh5t\" (UID: \"d2744f4f-889f-4833-82ac-129e28488162\") " pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t"
Apr 23 16:34:40.088107 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.087559 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/af33177a-4a99-4b44-8427-ff5e05da026f-tmp\") pod \"insights-operator-585dfdc468-lfp22\" (UID: \"af33177a-4a99-4b44-8427-ff5e05da026f\") " pod="openshift-insights/insights-operator-585dfdc468-lfp22"
Apr 23 16:34:40.088107 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.087634 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:34:40.088107 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.087676 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4778171-44e0-4227-8cef-29899b536604-cert podName:b4778171-44e0-4227-8cef-29899b536604 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:40.587662947 +0000 UTC m=+34.160175846 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4778171-44e0-4227-8cef-29899b536604-cert") pod "ingress-canary-hsn4f" (UID: "b4778171-44e0-4227-8cef-29899b536604") : secret "canary-serving-cert" not found
Apr 23 16:34:40.088107 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.087705 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d997e103-1c8d-4bb4-a579-2d6b344c089f-tmp-dir\") pod \"dns-default-g9pgx\" (UID: \"d997e103-1c8d-4bb4-a579-2d6b344c089f\") " pod="openshift-dns/dns-default-g9pgx"
Apr 23 16:34:40.088107 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.087790 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 16:34:40.088107 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.087850 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-networking-console-plugin-cert podName:860ae3e8-8bc5-4280-aefd-e5190c5e1db8 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:40.587832597 +0000 UTC m=+34.160345512 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6pbrl" (UID: "860ae3e8-8bc5-4280-aefd-e5190c5e1db8") : secret "networking-console-plugin-cert" not found
Apr 23 16:34:40.088107 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.087950 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5690f876-3daa-4cdb-bfe2-53f9544cadae-tmp\") pod \"klusterlet-addon-workmgr-66d676f5f7-7b8cr\" (UID: \"5690f876-3daa-4cdb-bfe2-53f9544cadae\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66d676f5f7-7b8cr"
Apr 23 16:34:40.088107 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.087979 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d997e103-1c8d-4bb4-a579-2d6b344c089f-config-volume\") pod \"dns-default-g9pgx\" (UID: \"d997e103-1c8d-4bb4-a579-2d6b344c089f\") " pod="openshift-dns/dns-default-g9pgx"
Apr 23 16:34:40.088107 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.087981 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af33177a-4a99-4b44-8427-ff5e05da026f-service-ca-bundle\") pod \"insights-operator-585dfdc468-lfp22\" (UID: \"af33177a-4a99-4b44-8427-ff5e05da026f\") " pod="openshift-insights/insights-operator-585dfdc468-lfp22"
Apr 23 16:34:40.089132 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.089106 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/5690f876-3daa-4cdb-bfe2-53f9544cadae-klusterlet-config\") pod \"klusterlet-addon-workmgr-66d676f5f7-7b8cr\" (UID: \"5690f876-3daa-4cdb-bfe2-53f9544cadae\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66d676f5f7-7b8cr"
Apr 23 16:34:40.089401 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.089377 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4b7746a0-7bbc-43fc-86d8-43f0e52d77c5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-cdfd48895-qqsc9\" (UID: \"4b7746a0-7bbc-43fc-86d8-43f0e52d77c5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdfd48895-qqsc9"
Apr 23 16:34:40.089669 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.089645 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-stats-auth\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l"
Apr 23 16:34:40.090165 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.090141 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-default-certificate\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l"
Apr 23 16:34:40.090256 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.090215 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799-serving-cert\") pod \"service-ca-operator-d6fc45fc5-5xt7q\" (UID: \"dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xt7q"
Apr 23 16:34:40.095777 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.095580 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nqvgv" Apr 23 16:34:40.108249 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.108219 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qgjq\" (UniqueName: \"kubernetes.io/projected/dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799-kube-api-access-8qgjq\") pod \"service-ca-operator-d6fc45fc5-5xt7q\" (UID: \"dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xt7q" Apr 23 16:34:40.109867 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.109817 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6f5w\" (UniqueName: \"kubernetes.io/projected/b4778171-44e0-4227-8cef-29899b536604-kube-api-access-s6f5w\") pod \"ingress-canary-hsn4f\" (UID: \"b4778171-44e0-4227-8cef-29899b536604\") " pod="openshift-ingress-canary/ingress-canary-hsn4f" Apr 23 16:34:40.110623 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.110544 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvlfh\" (UniqueName: \"kubernetes.io/projected/d2744f4f-889f-4833-82ac-129e28488162-kube-api-access-tvlfh\") pod \"console-operator-9d4b6777b-bwh5t\" (UID: \"d2744f4f-889f-4833-82ac-129e28488162\") " pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t" Apr 23 16:34:40.111009 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.110908 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9dqn\" (UniqueName: \"kubernetes.io/projected/5690f876-3daa-4cdb-bfe2-53f9544cadae-kube-api-access-j9dqn\") pod \"klusterlet-addon-workmgr-66d676f5f7-7b8cr\" (UID: \"5690f876-3daa-4cdb-bfe2-53f9544cadae\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66d676f5f7-7b8cr" Apr 23 16:34:40.111126 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.111104 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lbrlp\" (UniqueName: \"kubernetes.io/projected/d997e103-1c8d-4bb4-a579-2d6b344c089f-kube-api-access-lbrlp\") pod \"dns-default-g9pgx\" (UID: \"d997e103-1c8d-4bb4-a579-2d6b344c089f\") " pod="openshift-dns/dns-default-g9pgx" Apr 23 16:34:40.111946 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.111899 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7fh9\" (UniqueName: \"kubernetes.io/projected/2b95d992-bc38-4499-9a6e-ff4e0571a154-kube-api-access-j7fh9\") pod \"network-check-source-8894fc9bd-7rh2n\" (UID: \"2b95d992-bc38-4499-9a6e-ff4e0571a154\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7rh2n" Apr 23 16:34:40.112336 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.112307 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz5bm\" (UniqueName: \"kubernetes.io/projected/4b7746a0-7bbc-43fc-86d8-43f0e52d77c5-kube-api-access-cz5bm\") pod \"managed-serviceaccount-addon-agent-cdfd48895-qqsc9\" (UID: \"4b7746a0-7bbc-43fc-86d8-43f0e52d77c5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdfd48895-qqsc9" Apr 23 16:34:40.112716 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.112698 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q92lh\" (UniqueName: \"kubernetes.io/projected/fa0af365-ad6e-4695-bd7d-c6838cbcf027-kube-api-access-q92lh\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l" Apr 23 16:34:40.112963 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.112945 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz7hl\" (UniqueName: \"kubernetes.io/projected/af33177a-4a99-4b44-8427-ff5e05da026f-kube-api-access-vz7hl\") pod \"insights-operator-585dfdc468-lfp22\" (UID: 
\"af33177a-4a99-4b44-8427-ff5e05da026f\") " pod="openshift-insights/insights-operator-585dfdc468-lfp22" Apr 23 16:34:40.135577 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.135431 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wg4fk" Apr 23 16:34:40.152085 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.151075 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t" Apr 23 16:34:40.156945 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.156868 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-lfp22" Apr 23 16:34:40.170307 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.169944 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xt7q" Apr 23 16:34:40.186950 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.186681 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7rh2n" Apr 23 16:34:40.189306 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.189277 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdfd48895-qqsc9" Apr 23 16:34:40.213793 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.213326 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66d676f5f7-7b8cr" Apr 23 16:34:40.326854 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.323128 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z"] Apr 23 16:34:40.327270 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.327214 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nqvgv"] Apr 23 16:34:40.387275 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.387211 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wg4fk"] Apr 23 16:34:40.402688 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:40.402336 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod970788cb_b97f_467f_bd8e_69787c8efef5.slice/crio-14f8494bea04fd3384f7a83c1f3fe45cbd84850904b954c6fef73fb2e8d61f01 WatchSource:0}: Error finding container 14f8494bea04fd3384f7a83c1f3fe45cbd84850904b954c6fef73fb2e8d61f01: Status 404 returned error can't find the container with id 14f8494bea04fd3384f7a83c1f3fe45cbd84850904b954c6fef73fb2e8d61f01 Apr 23 16:34:40.415166 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.415110 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xt7q"] Apr 23 16:34:40.421873 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.421732 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-bwh5t"] Apr 23 16:34:40.426972 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:40.426938 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2744f4f_889f_4833_82ac_129e28488162.slice/crio-98c86f3460cea399642838f9242f07fae6524fb88586453dfd2b9ba0660c7d5d WatchSource:0}: Error finding container 98c86f3460cea399642838f9242f07fae6524fb88586453dfd2b9ba0660c7d5d: Status 404 returned error can't find the container with id 98c86f3460cea399642838f9242f07fae6524fb88586453dfd2b9ba0660c7d5d Apr 23 16:34:40.447537 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.447298 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-lfp22"] Apr 23 16:34:40.451940 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:40.451877 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf33177a_4a99_4b44_8427_ff5e05da026f.slice/crio-2031f9f435372828e1208192019113d8c57d45208632ff6ea0d849709713841d WatchSource:0}: Error finding container 2031f9f435372828e1208192019113d8c57d45208632ff6ea0d849709713841d: Status 404 returned error can't find the container with id 2031f9f435372828e1208192019113d8c57d45208632ff6ea0d849709713841d Apr 23 16:34:40.480186 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.479845 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-66d676f5f7-7b8cr"] Apr 23 16:34:40.483405 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.482659 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdfd48895-qqsc9"] Apr 23 16:34:40.483405 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:40.482698 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5690f876_3daa_4cdb_bfe2_53f9544cadae.slice/crio-8f3f9f1cb8790109e9d625dbc777a880809b3a9079a50fbb662535dcb5e09292 WatchSource:0}: Error finding container 
8f3f9f1cb8790109e9d625dbc777a880809b3a9079a50fbb662535dcb5e09292: Status 404 returned error can't find the container with id 8f3f9f1cb8790109e9d625dbc777a880809b3a9079a50fbb662535dcb5e09292 Apr 23 16:34:40.487053 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:40.487025 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b7746a0_7bbc_43fc_86d8_43f0e52d77c5.slice/crio-97fa678eba346b9cd533552b8ea03ffc9dba4201adb7f637fe1495a706588b5d WatchSource:0}: Error finding container 97fa678eba346b9cd533552b8ea03ffc9dba4201adb7f637fe1495a706588b5d: Status 404 returned error can't find the container with id 97fa678eba346b9cd533552b8ea03ffc9dba4201adb7f637fe1495a706588b5d Apr 23 16:34:40.488449 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.488429 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gc997\" (UID: \"6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gc997" Apr 23 16:34:40.488700 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.488547 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 16:34:40.488700 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.488586 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6e3b3b5e-7dd7-421c-9733-f304050ddbce-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d4nsk\" (UID: \"6e3b3b5e-7dd7-421c-9733-f304050ddbce\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk" Apr 23 16:34:40.488700 ip-10-0-133-231 kubenswrapper[2572]: E0423 
16:34:40.488601 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-samples-operator-tls podName:6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb nodeName:}" failed. No retries permitted until 2026-04-23 16:34:41.488583467 +0000 UTC m=+35.061096380 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gc997" (UID: "6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb") : secret "samples-operator-tls" not found Apr 23 16:34:40.488700 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.488651 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 16:34:40.488700 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.488651 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb" Apr 23 16:34:40.488700 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.488684 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7" Apr 23 16:34:40.488700 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.488699 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e3b3b5e-7dd7-421c-9733-f304050ddbce-cluster-monitoring-operator-tls podName:6e3b3b5e-7dd7-421c-9733-f304050ddbce nodeName:}" 
failed. No retries permitted until 2026-04-23 16:34:41.488686798 +0000 UTC m=+35.061199717 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6e3b3b5e-7dd7-421c-9733-f304050ddbce-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-d4nsk" (UID: "6e3b3b5e-7dd7-421c-9733-f304050ddbce") : secret "cluster-monitoring-operator-tls" not found Apr 23 16:34:40.489070 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.488770 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:34:40.489070 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.488782 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57798c5bd5-257tb: secret "image-registry-tls" not found Apr 23 16:34:40.489070 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.488801 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:34:40.489070 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.488812 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d445c8494-5dps7: secret "image-registry-tls" not found Apr 23 16:34:40.489070 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.488813 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls podName:5460e905-1b76-4876-8552-7f9866fe1dc0 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:41.488802547 +0000 UTC m=+35.061315456 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls") pod "image-registry-57798c5bd5-257tb" (UID: "5460e905-1b76-4876-8552-7f9866fe1dc0") : secret "image-registry-tls" not found Apr 23 16:34:40.489070 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.488852 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls podName:a1ccc661-cbd0-49b7-b172-5749b2c3e73f nodeName:}" failed. No retries permitted until 2026-04-23 16:34:41.4888409 +0000 UTC m=+35.061353798 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls") pod "image-registry-5d445c8494-5dps7" (UID: "a1ccc661-cbd0-49b7-b172-5749b2c3e73f") : secret "image-registry-tls" not found Apr 23 16:34:40.494764 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.494745 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-7rh2n"] Apr 23 16:34:40.497726 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:40.497701 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b95d992_bc38_4499_9a6e_ff4e0571a154.slice/crio-1010fbdd96e51f36832ec41405d67f51075f884680a953c829d76e31842d0911 WatchSource:0}: Error finding container 1010fbdd96e51f36832ec41405d67f51075f884680a953c829d76e31842d0911: Status 404 returned error can't find the container with id 1010fbdd96e51f36832ec41405d67f51075f884680a953c829d76e31842d0911 Apr 23 16:34:40.589853 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.589813 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa0af365-ad6e-4695-bd7d-c6838cbcf027-service-ca-bundle\") pod 
\"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l" Apr 23 16:34:40.590031 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.589872 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6pbrl\" (UID: \"860ae3e8-8bc5-4280-aefd-e5190c5e1db8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6pbrl" Apr 23 16:34:40.590031 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.589953 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4778171-44e0-4227-8cef-29899b536604-cert\") pod \"ingress-canary-hsn4f\" (UID: \"b4778171-44e0-4227-8cef-29899b536604\") " pod="openshift-ingress-canary/ingress-canary-hsn4f" Apr 23 16:34:40.590031 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.590013 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fa0af365-ad6e-4695-bd7d-c6838cbcf027-service-ca-bundle podName:fa0af365-ad6e-4695-bd7d-c6838cbcf027 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:41.589994734 +0000 UTC m=+35.162507678 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fa0af365-ad6e-4695-bd7d-c6838cbcf027-service-ca-bundle") pod "router-default-77d69779b-8754l" (UID: "fa0af365-ad6e-4695-bd7d-c6838cbcf027") : configmap references non-existent config key: service-ca.crt Apr 23 16:34:40.590212 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.590038 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d997e103-1c8d-4bb4-a579-2d6b344c089f-metrics-tls\") pod \"dns-default-g9pgx\" (UID: \"d997e103-1c8d-4bb4-a579-2d6b344c089f\") " pod="openshift-dns/dns-default-g9pgx" Apr 23 16:34:40.590212 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.590067 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:34:40.590212 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.590088 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-metrics-certs\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l" Apr 23 16:34:40.590212 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.590104 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4778171-44e0-4227-8cef-29899b536604-cert podName:b4778171-44e0-4227-8cef-29899b536604 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:41.59009327 +0000 UTC m=+35.162606175 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4778171-44e0-4227-8cef-29899b536604-cert") pod "ingress-canary-hsn4f" (UID: "b4778171-44e0-4227-8cef-29899b536604") : secret "canary-serving-cert" not found Apr 23 16:34:40.590212 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.590143 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 16:34:40.590212 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.590174 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-metrics-certs podName:fa0af365-ad6e-4695-bd7d-c6838cbcf027 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:41.590164537 +0000 UTC m=+35.162677453 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-metrics-certs") pod "router-default-77d69779b-8754l" (UID: "fa0af365-ad6e-4695-bd7d-c6838cbcf027") : secret "router-metrics-certs-default" not found Apr 23 16:34:40.590212 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.590189 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 16:34:40.590616 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.590221 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-networking-console-plugin-cert podName:860ae3e8-8bc5-4280-aefd-e5190c5e1db8 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:41.590211336 +0000 UTC m=+35.162724248 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6pbrl" (UID: "860ae3e8-8bc5-4280-aefd-e5190c5e1db8") : secret "networking-console-plugin-cert" not found Apr 23 16:34:40.590616 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.590226 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:34:40.590616 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.590253 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d997e103-1c8d-4bb4-a579-2d6b344c089f-metrics-tls podName:d997e103-1c8d-4bb4-a579-2d6b344c089f nodeName:}" failed. No retries permitted until 2026-04-23 16:34:41.590244501 +0000 UTC m=+35.162757407 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d997e103-1c8d-4bb4-a579-2d6b344c089f-metrics-tls") pod "dns-default-g9pgx" (UID: "d997e103-1c8d-4bb4-a579-2d6b344c089f") : secret "dns-default-metrics-tls" not found Apr 23 16:34:40.792121 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.792088 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs\") pod \"network-metrics-daemon-glcj7\" (UID: \"eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef\") " pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:40.792629 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.792249 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:40.792629 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:40.792313 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs podName:eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef nodeName:}" failed. No retries permitted until 2026-04-23 16:35:12.792294092 +0000 UTC m=+66.364807005 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs") pod "network-metrics-daemon-glcj7" (UID: "eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:40.893477 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.893441 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sv2v\" (UniqueName: \"kubernetes.io/projected/1fcbd9bc-88ba-48d1-978b-f8e2585ab84c-kube-api-access-2sv2v\") pod \"network-check-target-d7t2c\" (UID: \"1fcbd9bc-88ba-48d1-978b-f8e2585ab84c\") " pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:40.897933 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.897893 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sv2v\" (UniqueName: \"kubernetes.io/projected/1fcbd9bc-88ba-48d1-978b-f8e2585ab84c-kube-api-access-2sv2v\") pod \"network-check-target-d7t2c\" (UID: \"1fcbd9bc-88ba-48d1-978b-f8e2585ab84c\") " pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:40.936412 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:40.936375 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:41.027085 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.027056 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:34:41.030220 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.029963 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 16:34:41.030220 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.029966 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vvm92\"" Apr 23 16:34:41.092053 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.091961 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-d7t2c"] Apr 23 16:34:41.098236 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:41.098202 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fcbd9bc_88ba_48d1_978b_f8e2585ab84c.slice/crio-8502b4e20a300123f10b527cc814895e055a9a947ca4a38fe769cb926bae4a7f WatchSource:0}: Error finding container 8502b4e20a300123f10b527cc814895e055a9a947ca4a38fe769cb926bae4a7f: Status 404 returned error can't find the container with id 8502b4e20a300123f10b527cc814895e055a9a947ca4a38fe769cb926bae4a7f Apr 23 16:34:41.173485 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.173447 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-lfp22" event={"ID":"af33177a-4a99-4b44-8427-ff5e05da026f","Type":"ContainerStarted","Data":"2031f9f435372828e1208192019113d8c57d45208632ff6ea0d849709713841d"} Apr 23 16:34:41.178843 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.178812 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nqvgv" event={"ID":"cef5ddfd-948f-4294-a2d2-9123e23feea6","Type":"ContainerStarted","Data":"93f4d22a324afa78b132e176606449723ce515bda0c748a886c07361cdbb0740"} Apr 23 
16:34:41.182946 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.182859 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z" event={"ID":"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d","Type":"ContainerStarted","Data":"306d650157729f599ce92f2fa9cddba172b5b98c4ac85290f8b2e60d1311a929"} Apr 23 16:34:41.186821 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.186762 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66d676f5f7-7b8cr" event={"ID":"5690f876-3daa-4cdb-bfe2-53f9544cadae","Type":"ContainerStarted","Data":"8f3f9f1cb8790109e9d625dbc777a880809b3a9079a50fbb662535dcb5e09292"} Apr 23 16:34:41.189469 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.189405 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7rh2n" event={"ID":"2b95d992-bc38-4499-9a6e-ff4e0571a154","Type":"ContainerStarted","Data":"1010fbdd96e51f36832ec41405d67f51075f884680a953c829d76e31842d0911"} Apr 23 16:34:41.194484 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.194424 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xt7q" event={"ID":"dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799","Type":"ContainerStarted","Data":"20f48c1bacc66be8f25b3bfcd3fc7e7c7d3afe773213ad8f5b888efbe395c4cd"} Apr 23 16:34:41.202576 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.202529 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t" event={"ID":"d2744f4f-889f-4833-82ac-129e28488162","Type":"ContainerStarted","Data":"98c86f3460cea399642838f9242f07fae6524fb88586453dfd2b9ba0660c7d5d"} Apr 23 16:34:41.206981 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.206951 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wg4fk" event={"ID":"970788cb-b97f-467f-bd8e-69787c8efef5","Type":"ContainerStarted","Data":"14f8494bea04fd3384f7a83c1f3fe45cbd84850904b954c6fef73fb2e8d61f01"}
Apr 23 16:34:41.210200 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.210154 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-d7t2c" event={"ID":"1fcbd9bc-88ba-48d1-978b-f8e2585ab84c","Type":"ContainerStarted","Data":"8502b4e20a300123f10b527cc814895e055a9a947ca4a38fe769cb926bae4a7f"}
Apr 23 16:34:41.214057 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.214002 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdfd48895-qqsc9" event={"ID":"4b7746a0-7bbc-43fc-86d8-43f0e52d77c5","Type":"ContainerStarted","Data":"97fa678eba346b9cd533552b8ea03ffc9dba4201adb7f637fe1495a706588b5d"}
Apr 23 16:34:41.501005 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.499816 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gc997\" (UID: \"6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gc997"
Apr 23 16:34:41.501005 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.499983 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6e3b3b5e-7dd7-421c-9733-f304050ddbce-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d4nsk\" (UID: \"6e3b3b5e-7dd7-421c-9733-f304050ddbce\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk"
Apr 23 16:34:41.501005 ip-10-0-133-231
kubenswrapper[2572]: I0423 16:34:41.500030 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb"
Apr 23 16:34:41.501005 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.500059 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7"
Apr 23 16:34:41.501005 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:41.500221 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 16:34:41.501005 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:41.500235 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d445c8494-5dps7: secret "image-registry-tls" not found
Apr 23 16:34:41.501005 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:41.500290 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls podName:a1ccc661-cbd0-49b7-b172-5749b2c3e73f nodeName:}" failed. No retries permitted until 2026-04-23 16:34:43.500273008 +0000 UTC m=+37.072785921 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls") pod "image-registry-5d445c8494-5dps7" (UID: "a1ccc661-cbd0-49b7-b172-5749b2c3e73f") : secret "image-registry-tls" not found
Apr 23 16:34:41.501005 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:41.500706 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 16:34:41.501005 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:41.500757 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-samples-operator-tls podName:6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb nodeName:}" failed. No retries permitted until 2026-04-23 16:34:43.500741757 +0000 UTC m=+37.073254662 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gc997" (UID: "6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb") : secret "samples-operator-tls" not found
Apr 23 16:34:41.501005 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:41.500814 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 16:34:41.501005 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:41.500848 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e3b3b5e-7dd7-421c-9733-f304050ddbce-cluster-monitoring-operator-tls podName:6e3b3b5e-7dd7-421c-9733-f304050ddbce nodeName:}" failed. No retries permitted until 2026-04-23 16:34:43.500836761 +0000 UTC m=+37.073349666 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6e3b3b5e-7dd7-421c-9733-f304050ddbce-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-d4nsk" (UID: "6e3b3b5e-7dd7-421c-9733-f304050ddbce") : secret "cluster-monitoring-operator-tls" not found
Apr 23 16:34:41.501005 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:41.500906 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 16:34:41.501005 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:41.500934 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57798c5bd5-257tb: secret "image-registry-tls" not found
Apr 23 16:34:41.501005 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:41.500962 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls podName:5460e905-1b76-4876-8552-7f9866fe1dc0 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:43.500953332 +0000 UTC m=+37.073466236 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls") pod "image-registry-57798c5bd5-257tb" (UID: "5460e905-1b76-4876-8552-7f9866fe1dc0") : secret "image-registry-tls" not found
Apr 23 16:34:41.601265 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.601227 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d997e103-1c8d-4bb4-a579-2d6b344c089f-metrics-tls\") pod \"dns-default-g9pgx\" (UID: \"d997e103-1c8d-4bb4-a579-2d6b344c089f\") " pod="openshift-dns/dns-default-g9pgx"
Apr 23 16:34:41.601437 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.601306 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-metrics-certs\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l"
Apr 23 16:34:41.601498 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.601460 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa0af365-ad6e-4695-bd7d-c6838cbcf027-service-ca-bundle\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l"
Apr 23 16:34:41.602576 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.601512 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6pbrl\" (UID: \"860ae3e8-8bc5-4280-aefd-e5190c5e1db8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6pbrl"
Apr 23 16:34:41.602576
ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:41.601653 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4778171-44e0-4227-8cef-29899b536604-cert\") pod \"ingress-canary-hsn4f\" (UID: \"b4778171-44e0-4227-8cef-29899b536604\") " pod="openshift-ingress-canary/ingress-canary-hsn4f"
Apr 23 16:34:41.602576 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:41.601789 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:34:41.602576 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:41.601869 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4778171-44e0-4227-8cef-29899b536604-cert podName:b4778171-44e0-4227-8cef-29899b536604 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:43.601830148 +0000 UTC m=+37.174343052 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4778171-44e0-4227-8cef-29899b536604-cert") pod "ingress-canary-hsn4f" (UID: "b4778171-44e0-4227-8cef-29899b536604") : secret "canary-serving-cert" not found
Apr 23 16:34:41.602576 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:41.602341 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:34:41.602576 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:41.602366 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fa0af365-ad6e-4695-bd7d-c6838cbcf027-service-ca-bundle podName:fa0af365-ad6e-4695-bd7d-c6838cbcf027 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:43.602350202 +0000 UTC m=+37.174863120 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fa0af365-ad6e-4695-bd7d-c6838cbcf027-service-ca-bundle") pod "router-default-77d69779b-8754l" (UID: "fa0af365-ad6e-4695-bd7d-c6838cbcf027") : configmap references non-existent config key: service-ca.crt
Apr 23 16:34:41.602576 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:41.602388 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d997e103-1c8d-4bb4-a579-2d6b344c089f-metrics-tls podName:d997e103-1c8d-4bb4-a579-2d6b344c089f nodeName:}" failed. No retries permitted until 2026-04-23 16:34:43.602378932 +0000 UTC m=+37.174891836 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d997e103-1c8d-4bb4-a579-2d6b344c089f-metrics-tls") pod "dns-default-g9pgx" (UID: "d997e103-1c8d-4bb4-a579-2d6b344c089f") : secret "dns-default-metrics-tls" not found
Apr 23 16:34:41.602576 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:41.602404 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 16:34:41.602576 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:41.602424 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 16:34:41.602576 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:41.602461 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-metrics-certs podName:fa0af365-ad6e-4695-bd7d-c6838cbcf027 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:43.602449925 +0000 UTC m=+37.174962839 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-metrics-certs") pod "router-default-77d69779b-8754l" (UID: "fa0af365-ad6e-4695-bd7d-c6838cbcf027") : secret "router-metrics-certs-default" not found
Apr 23 16:34:41.602576 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:41.602490 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-networking-console-plugin-cert podName:860ae3e8-8bc5-4280-aefd-e5190c5e1db8 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:43.60246964 +0000 UTC m=+37.174982548 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6pbrl" (UID: "860ae3e8-8bc5-4280-aefd-e5190c5e1db8") : secret "networking-console-plugin-cert" not found
Apr 23 16:34:43.526045 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:43.525954 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6e3b3b5e-7dd7-421c-9733-f304050ddbce-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d4nsk\" (UID: \"6e3b3b5e-7dd7-421c-9733-f304050ddbce\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk"
Apr 23 16:34:43.526045 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:43.526027 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb"
Apr 23 16:34:43.526574 ip-10-0-133-231 kubenswrapper[2572]: I0423
16:34:43.526056 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7"
Apr 23 16:34:43.526574 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:43.526079 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 16:34:43.526574 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:43.526120 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gc997\" (UID: \"6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gc997"
Apr 23 16:34:43.526574 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:43.526143 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e3b3b5e-7dd7-421c-9733-f304050ddbce-cluster-monitoring-operator-tls podName:6e3b3b5e-7dd7-421c-9733-f304050ddbce nodeName:}" failed. No retries permitted until 2026-04-23 16:34:47.526124146 +0000 UTC m=+41.098637062 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6e3b3b5e-7dd7-421c-9733-f304050ddbce-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-d4nsk" (UID: "6e3b3b5e-7dd7-421c-9733-f304050ddbce") : secret "cluster-monitoring-operator-tls" not found
Apr 23 16:34:43.526574 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:43.526190 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 16:34:43.526574 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:43.526234 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-samples-operator-tls podName:6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb nodeName:}" failed. No retries permitted until 2026-04-23 16:34:47.526224131 +0000 UTC m=+41.098737037 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gc997" (UID: "6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb") : secret "samples-operator-tls" not found
Apr 23 16:34:43.526574 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:43.526288 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 16:34:43.526574 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:43.526300 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57798c5bd5-257tb: secret "image-registry-tls" not found
Apr 23 16:34:43.526574 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:43.526328 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls podName:5460e905-1b76-4876-8552-7f9866fe1dc0 nodeName:}"
failed. No retries permitted until 2026-04-23 16:34:47.52631816 +0000 UTC m=+41.098831061 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls") pod "image-registry-57798c5bd5-257tb" (UID: "5460e905-1b76-4876-8552-7f9866fe1dc0") : secret "image-registry-tls" not found
Apr 23 16:34:43.526574 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:43.526380 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 16:34:43.526574 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:43.526391 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d445c8494-5dps7: secret "image-registry-tls" not found
Apr 23 16:34:43.526574 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:43.526420 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls podName:a1ccc661-cbd0-49b7-b172-5749b2c3e73f nodeName:}" failed. No retries permitted until 2026-04-23 16:34:47.526409387 +0000 UTC m=+41.098922291 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls") pod "image-registry-5d445c8494-5dps7" (UID: "a1ccc661-cbd0-49b7-b172-5749b2c3e73f") : secret "image-registry-tls" not found
Apr 23 16:34:43.627956 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:43.627103 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa0af365-ad6e-4695-bd7d-c6838cbcf027-service-ca-bundle\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l"
Apr 23 16:34:43.627956 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:43.627169 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6pbrl\" (UID: \"860ae3e8-8bc5-4280-aefd-e5190c5e1db8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6pbrl"
Apr 23 16:34:43.627956 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:43.627220 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4778171-44e0-4227-8cef-29899b536604-cert\") pod \"ingress-canary-hsn4f\" (UID: \"b4778171-44e0-4227-8cef-29899b536604\") " pod="openshift-ingress-canary/ingress-canary-hsn4f"
Apr 23 16:34:43.627956 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:43.627262 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d997e103-1c8d-4bb4-a579-2d6b344c089f-metrics-tls\") pod \"dns-default-g9pgx\" (UID: \"d997e103-1c8d-4bb4-a579-2d6b344c089f\") " pod="openshift-dns/dns-default-g9pgx"
Apr 23 16:34:43.627956 ip-10-0-133-231
kubenswrapper[2572]: I0423 16:34:43.627306 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-metrics-certs\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l"
Apr 23 16:34:43.627956 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:43.627389 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fa0af365-ad6e-4695-bd7d-c6838cbcf027-service-ca-bundle podName:fa0af365-ad6e-4695-bd7d-c6838cbcf027 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:47.627339621 +0000 UTC m=+41.199852529 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fa0af365-ad6e-4695-bd7d-c6838cbcf027-service-ca-bundle") pod "router-default-77d69779b-8754l" (UID: "fa0af365-ad6e-4695-bd7d-c6838cbcf027") : configmap references non-existent config key: service-ca.crt
Apr 23 16:34:43.627956 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:43.627477 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 16:34:43.627956 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:43.627517 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-networking-console-plugin-cert podName:860ae3e8-8bc5-4280-aefd-e5190c5e1db8 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:47.627504687 +0000 UTC m=+41.200017585 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6pbrl" (UID: "860ae3e8-8bc5-4280-aefd-e5190c5e1db8") : secret "networking-console-plugin-cert" not found
Apr 23 16:34:43.627956 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:43.627580 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:34:43.627956 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:43.627610 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4778171-44e0-4227-8cef-29899b536604-cert podName:b4778171-44e0-4227-8cef-29899b536604 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:47.627600151 +0000 UTC m=+41.200113052 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4778171-44e0-4227-8cef-29899b536604-cert") pod "ingress-canary-hsn4f" (UID: "b4778171-44e0-4227-8cef-29899b536604") : secret "canary-serving-cert" not found
Apr 23 16:34:43.627956 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:43.627662 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:34:43.627956 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:43.627731 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 16:34:43.627956 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:43.627756 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d997e103-1c8d-4bb4-a579-2d6b344c089f-metrics-tls podName:d997e103-1c8d-4bb4-a579-2d6b344c089f nodeName:}" failed. No retries permitted until 2026-04-23 16:34:47.627718968 +0000 UTC m=+41.200231873 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d997e103-1c8d-4bb4-a579-2d6b344c089f-metrics-tls") pod "dns-default-g9pgx" (UID: "d997e103-1c8d-4bb4-a579-2d6b344c089f") : secret "dns-default-metrics-tls" not found
Apr 23 16:34:43.627956 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:43.627774 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-metrics-certs podName:fa0af365-ad6e-4695-bd7d-c6838cbcf027 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:47.627765086 +0000 UTC m=+41.200277988 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-metrics-certs") pod "router-default-77d69779b-8754l" (UID: "fa0af365-ad6e-4695-bd7d-c6838cbcf027") : secret "router-metrics-certs-default" not found
Apr 23 16:34:47.570655 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:47.570608 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6e3b3b5e-7dd7-421c-9733-f304050ddbce-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d4nsk\" (UID: \"6e3b3b5e-7dd7-421c-9733-f304050ddbce\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk"
Apr 23 16:34:47.571302 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:47.570685 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb"
Apr 23 16:34:47.571302 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:47.570718 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7"
Apr 23 16:34:47.571302 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:47.570775 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 16:34:47.571302 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:47.570790 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gc997\" (UID: \"6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gc997"
Apr 23 16:34:47.571302 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:47.570847 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e3b3b5e-7dd7-421c-9733-f304050ddbce-cluster-monitoring-operator-tls podName:6e3b3b5e-7dd7-421c-9733-f304050ddbce nodeName:}" failed. No retries permitted until 2026-04-23 16:34:55.570830158 +0000 UTC m=+49.143343061 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6e3b3b5e-7dd7-421c-9733-f304050ddbce-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-d4nsk" (UID: "6e3b3b5e-7dd7-421c-9733-f304050ddbce") : secret "cluster-monitoring-operator-tls" not found
Apr 23 16:34:47.571302 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:47.570856 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 16:34:47.571302 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:47.570877 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57798c5bd5-257tb: secret "image-registry-tls" not found
Apr 23 16:34:47.571302 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:47.570905 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 16:34:47.571302 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:47.570906 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 16:34:47.571302 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:47.570939 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls podName:5460e905-1b76-4876-8552-7f9866fe1dc0 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:55.570909072 +0000 UTC m=+49.143421990 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls") pod "image-registry-57798c5bd5-257tb" (UID: "5460e905-1b76-4876-8552-7f9866fe1dc0") : secret "image-registry-tls" not found
Apr 23 16:34:47.571302 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:47.570942 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d445c8494-5dps7: secret "image-registry-tls" not found
Apr 23 16:34:47.571302 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:47.570966 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-samples-operator-tls podName:6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb nodeName:}" failed. No retries permitted until 2026-04-23 16:34:55.570951364 +0000 UTC m=+49.143464272 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gc997" (UID: "6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb") : secret "samples-operator-tls" not found
Apr 23 16:34:47.571302 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:47.570994 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls podName:a1ccc661-cbd0-49b7-b172-5749b2c3e73f nodeName:}" failed. No retries permitted until 2026-04-23 16:34:55.570978423 +0000 UTC m=+49.143491343 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls") pod "image-registry-5d445c8494-5dps7" (UID: "a1ccc661-cbd0-49b7-b172-5749b2c3e73f") : secret "image-registry-tls" not found
Apr 23 16:34:47.672104 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:47.672070 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6pbrl\" (UID: \"860ae3e8-8bc5-4280-aefd-e5190c5e1db8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6pbrl"
Apr 23 16:34:47.672264 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:47.672125 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4778171-44e0-4227-8cef-29899b536604-cert\") pod \"ingress-canary-hsn4f\" (UID: \"b4778171-44e0-4227-8cef-29899b536604\") " pod="openshift-ingress-canary/ingress-canary-hsn4f"
Apr 23 16:34:47.672264 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:47.672216 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:34:47.672264 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:47.672229 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 16:34:47.672387 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:47.672271 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4778171-44e0-4227-8cef-29899b536604-cert podName:b4778171-44e0-4227-8cef-29899b536604 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:55.672258303 +0000 UTC m=+49.244771202 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4778171-44e0-4227-8cef-29899b536604-cert") pod "ingress-canary-hsn4f" (UID: "b4778171-44e0-4227-8cef-29899b536604") : secret "canary-serving-cert" not found Apr 23 16:34:47.672387 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:47.672298 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-networking-console-plugin-cert podName:860ae3e8-8bc5-4280-aefd-e5190c5e1db8 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:55.672276862 +0000 UTC m=+49.244789763 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6pbrl" (UID: "860ae3e8-8bc5-4280-aefd-e5190c5e1db8") : secret "networking-console-plugin-cert" not found Apr 23 16:34:47.672387 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:47.672336 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d997e103-1c8d-4bb4-a579-2d6b344c089f-metrics-tls\") pod \"dns-default-g9pgx\" (UID: \"d997e103-1c8d-4bb4-a579-2d6b344c089f\") " pod="openshift-dns/dns-default-g9pgx" Apr 23 16:34:47.672552 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:47.672395 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-metrics-certs\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l" Apr 23 16:34:47.672552 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:47.672474 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 
16:34:47.672552 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:47.672509 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa0af365-ad6e-4695-bd7d-c6838cbcf027-service-ca-bundle\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l" Apr 23 16:34:47.672552 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:47.672532 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d997e103-1c8d-4bb4-a579-2d6b344c089f-metrics-tls podName:d997e103-1c8d-4bb4-a579-2d6b344c089f nodeName:}" failed. No retries permitted until 2026-04-23 16:34:55.672517528 +0000 UTC m=+49.245030446 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d997e103-1c8d-4bb4-a579-2d6b344c089f-metrics-tls") pod "dns-default-g9pgx" (UID: "d997e103-1c8d-4bb4-a579-2d6b344c089f") : secret "dns-default-metrics-tls" not found Apr 23 16:34:47.672732 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:47.672591 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fa0af365-ad6e-4695-bd7d-c6838cbcf027-service-ca-bundle podName:fa0af365-ad6e-4695-bd7d-c6838cbcf027 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:55.672577465 +0000 UTC m=+49.245090364 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fa0af365-ad6e-4695-bd7d-c6838cbcf027-service-ca-bundle") pod "router-default-77d69779b-8754l" (UID: "fa0af365-ad6e-4695-bd7d-c6838cbcf027") : configmap references non-existent config key: service-ca.crt Apr 23 16:34:47.672732 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:47.672644 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 16:34:47.672732 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:47.672689 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-metrics-certs podName:fa0af365-ad6e-4695-bd7d-c6838cbcf027 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:55.67267747 +0000 UTC m=+49.245190378 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-metrics-certs") pod "router-default-77d69779b-8754l" (UID: "fa0af365-ad6e-4695-bd7d-c6838cbcf027") : secret "router-metrics-certs-default" not found Apr 23 16:34:50.773249 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:50.770716 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-p5ms5"] Apr 23 16:34:50.782818 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:50.782788 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-p5ms5"] Apr 23 16:34:50.782987 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:50.782934 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-p5ms5" Apr 23 16:34:50.786775 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:50.786744 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 16:34:50.803306 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:50.803283 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9ea279d1-4c47-464b-9c84-1868a227c6b6-original-pull-secret\") pod \"global-pull-secret-syncer-p5ms5\" (UID: \"9ea279d1-4c47-464b-9c84-1868a227c6b6\") " pod="kube-system/global-pull-secret-syncer-p5ms5" Apr 23 16:34:50.803420 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:50.803374 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9ea279d1-4c47-464b-9c84-1868a227c6b6-dbus\") pod \"global-pull-secret-syncer-p5ms5\" (UID: \"9ea279d1-4c47-464b-9c84-1868a227c6b6\") " pod="kube-system/global-pull-secret-syncer-p5ms5" Apr 23 16:34:50.803465 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:50.803438 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9ea279d1-4c47-464b-9c84-1868a227c6b6-kubelet-config\") pod \"global-pull-secret-syncer-p5ms5\" (UID: \"9ea279d1-4c47-464b-9c84-1868a227c6b6\") " pod="kube-system/global-pull-secret-syncer-p5ms5" Apr 23 16:34:50.904125 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:50.904097 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9ea279d1-4c47-464b-9c84-1868a227c6b6-original-pull-secret\") pod \"global-pull-secret-syncer-p5ms5\" (UID: \"9ea279d1-4c47-464b-9c84-1868a227c6b6\") " pod="kube-system/global-pull-secret-syncer-p5ms5" Apr 
23 16:34:50.904295 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:50.904140 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9ea279d1-4c47-464b-9c84-1868a227c6b6-dbus\") pod \"global-pull-secret-syncer-p5ms5\" (UID: \"9ea279d1-4c47-464b-9c84-1868a227c6b6\") " pod="kube-system/global-pull-secret-syncer-p5ms5" Apr 23 16:34:50.904295 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:50.904174 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9ea279d1-4c47-464b-9c84-1868a227c6b6-kubelet-config\") pod \"global-pull-secret-syncer-p5ms5\" (UID: \"9ea279d1-4c47-464b-9c84-1868a227c6b6\") " pod="kube-system/global-pull-secret-syncer-p5ms5" Apr 23 16:34:50.904406 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:50.904310 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9ea279d1-4c47-464b-9c84-1868a227c6b6-kubelet-config\") pod \"global-pull-secret-syncer-p5ms5\" (UID: \"9ea279d1-4c47-464b-9c84-1868a227c6b6\") " pod="kube-system/global-pull-secret-syncer-p5ms5" Apr 23 16:34:50.904453 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:50.904405 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9ea279d1-4c47-464b-9c84-1868a227c6b6-dbus\") pod \"global-pull-secret-syncer-p5ms5\" (UID: \"9ea279d1-4c47-464b-9c84-1868a227c6b6\") " pod="kube-system/global-pull-secret-syncer-p5ms5" Apr 23 16:34:50.919442 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:50.919418 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9ea279d1-4c47-464b-9c84-1868a227c6b6-original-pull-secret\") pod \"global-pull-secret-syncer-p5ms5\" (UID: \"9ea279d1-4c47-464b-9c84-1868a227c6b6\") " 
pod="kube-system/global-pull-secret-syncer-p5ms5" Apr 23 16:34:51.094406 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:51.094337 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p5ms5" Apr 23 16:34:55.209926 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:55.209872 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-p5ms5"] Apr 23 16:34:55.258472 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:55.258402 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ea279d1_4c47_464b_9c84_1868a227c6b6.slice/crio-fb295e3265b520698e9fdd53e59b03a6357f8bacc448ded1899d05b814ac4d10 WatchSource:0}: Error finding container fb295e3265b520698e9fdd53e59b03a6357f8bacc448ded1899d05b814ac4d10: Status 404 returned error can't find the container with id fb295e3265b520698e9fdd53e59b03a6357f8bacc448ded1899d05b814ac4d10 Apr 23 16:34:55.278639 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:55.278608 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-p5ms5" event={"ID":"9ea279d1-4c47-464b-9c84-1868a227c6b6","Type":"ContainerStarted","Data":"fb295e3265b520698e9fdd53e59b03a6357f8bacc448ded1899d05b814ac4d10"} Apr 23 16:34:55.650335 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:55.649708 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb" Apr 23 16:34:55.650335 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:55.649767 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7" Apr 23 16:34:55.650335 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:55.649837 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gc997\" (UID: \"6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gc997" Apr 23 16:34:55.650335 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:55.649911 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:34:55.650335 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:55.649947 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57798c5bd5-257tb: secret "image-registry-tls" not found Apr 23 16:34:55.650335 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:55.649984 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6e3b3b5e-7dd7-421c-9733-f304050ddbce-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d4nsk\" (UID: \"6e3b3b5e-7dd7-421c-9733-f304050ddbce\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk" Apr 23 16:34:55.650335 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:55.650010 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls podName:5460e905-1b76-4876-8552-7f9866fe1dc0 nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:11.649989042 +0000 UTC m=+65.222501955 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls") pod "image-registry-57798c5bd5-257tb" (UID: "5460e905-1b76-4876-8552-7f9866fe1dc0") : secret "image-registry-tls" not found Apr 23 16:34:55.650335 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:55.650084 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 16:34:55.650335 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:55.650136 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e3b3b5e-7dd7-421c-9733-f304050ddbce-cluster-monitoring-operator-tls podName:6e3b3b5e-7dd7-421c-9733-f304050ddbce nodeName:}" failed. No retries permitted until 2026-04-23 16:35:11.650121302 +0000 UTC m=+65.222634208 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6e3b3b5e-7dd7-421c-9733-f304050ddbce-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-d4nsk" (UID: "6e3b3b5e-7dd7-421c-9733-f304050ddbce") : secret "cluster-monitoring-operator-tls" not found Apr 23 16:34:55.650335 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:55.650188 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:34:55.650335 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:55.650197 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d445c8494-5dps7: secret "image-registry-tls" not found Apr 23 16:34:55.650335 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:55.650228 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls podName:a1ccc661-cbd0-49b7-b172-5749b2c3e73f nodeName:}" failed. No retries permitted until 2026-04-23 16:35:11.650215867 +0000 UTC m=+65.222728773 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls") pod "image-registry-5d445c8494-5dps7" (UID: "a1ccc661-cbd0-49b7-b172-5749b2c3e73f") : secret "image-registry-tls" not found Apr 23 16:34:55.650335 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:55.650280 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 16:34:55.650335 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:55.650310 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-samples-operator-tls podName:6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:11.650300206 +0000 UTC m=+65.222813110 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gc997" (UID: "6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb") : secret "samples-operator-tls" not found Apr 23 16:34:55.751410 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:55.751376 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4778171-44e0-4227-8cef-29899b536604-cert\") pod \"ingress-canary-hsn4f\" (UID: \"b4778171-44e0-4227-8cef-29899b536604\") " pod="openshift-ingress-canary/ingress-canary-hsn4f" Apr 23 16:34:55.751517 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:55.751441 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d997e103-1c8d-4bb4-a579-2d6b344c089f-metrics-tls\") pod \"dns-default-g9pgx\" (UID: \"d997e103-1c8d-4bb4-a579-2d6b344c089f\") " pod="openshift-dns/dns-default-g9pgx" Apr 23 16:34:55.751517 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:55.751488 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-metrics-certs\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l" Apr 23 16:34:55.751586 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:55.751541 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:34:55.751618 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:55.751583 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fa0af365-ad6e-4695-bd7d-c6838cbcf027-service-ca-bundle\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l" Apr 23 16:34:55.751618 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:55.751612 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4778171-44e0-4227-8cef-29899b536604-cert podName:b4778171-44e0-4227-8cef-29899b536604 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:11.751591369 +0000 UTC m=+65.324104275 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4778171-44e0-4227-8cef-29899b536604-cert") pod "ingress-canary-hsn4f" (UID: "b4778171-44e0-4227-8cef-29899b536604") : secret "canary-serving-cert" not found Apr 23 16:34:55.751690 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:55.751667 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6pbrl\" (UID: \"860ae3e8-8bc5-4280-aefd-e5190c5e1db8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6pbrl" Apr 23 16:34:55.751690 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:55.751679 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fa0af365-ad6e-4695-bd7d-c6838cbcf027-service-ca-bundle podName:fa0af365-ad6e-4695-bd7d-c6838cbcf027 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:11.751664955 +0000 UTC m=+65.324177860 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fa0af365-ad6e-4695-bd7d-c6838cbcf027-service-ca-bundle") pod "router-default-77d69779b-8754l" (UID: "fa0af365-ad6e-4695-bd7d-c6838cbcf027") : configmap references non-existent config key: service-ca.crt Apr 23 16:34:55.751757 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:55.751738 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 16:34:55.751788 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:55.751770 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-networking-console-plugin-cert podName:860ae3e8-8bc5-4280-aefd-e5190c5e1db8 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:11.751760106 +0000 UTC m=+65.324273014 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6pbrl" (UID: "860ae3e8-8bc5-4280-aefd-e5190c5e1db8") : secret "networking-console-plugin-cert" not found Apr 23 16:34:55.751824 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:55.751812 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:34:55.751874 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:55.751845 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d997e103-1c8d-4bb4-a579-2d6b344c089f-metrics-tls podName:d997e103-1c8d-4bb4-a579-2d6b344c089f nodeName:}" failed. No retries permitted until 2026-04-23 16:35:11.751834461 +0000 UTC m=+65.324347367 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d997e103-1c8d-4bb4-a579-2d6b344c089f-metrics-tls") pod "dns-default-g9pgx" (UID: "d997e103-1c8d-4bb4-a579-2d6b344c089f") : secret "dns-default-metrics-tls" not found Apr 23 16:34:55.751937 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:55.751896 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 16:34:55.751978 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:55.751941 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-metrics-certs podName:fa0af365-ad6e-4695-bd7d-c6838cbcf027 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:11.751931075 +0000 UTC m=+65.324443980 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-metrics-certs") pod "router-default-77d69779b-8754l" (UID: "fa0af365-ad6e-4695-bd7d-c6838cbcf027") : secret "router-metrics-certs-default" not found Apr 23 16:34:56.297777 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.297733 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66d676f5f7-7b8cr" event={"ID":"5690f876-3daa-4cdb-bfe2-53f9544cadae","Type":"ContainerStarted","Data":"9df2bfe0b085635e597084c04632c6dadc6f7f5c9cb6797d3bf39b292f589d19"} Apr 23 16:34:56.298239 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.298095 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66d676f5f7-7b8cr" Apr 23 16:34:56.299936 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.299887 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66d676f5f7-7b8cr" Apr 23 16:34:56.300415 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.300392 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7rh2n" event={"ID":"2b95d992-bc38-4499-9a6e-ff4e0571a154","Type":"ContainerStarted","Data":"b96a0ef2d87d2a18fc0675a60fefa60e339204519a922639b3c8395dd9a398ca"} Apr 23 16:34:56.302839 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.302802 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xt7q" event={"ID":"dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799","Type":"ContainerStarted","Data":"dc5ceee50bcd650226920117b0bf19414c77e9877afb928aaf9d3cba6de98523"} Apr 23 16:34:56.304805 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.304663 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bwh5t_d2744f4f-889f-4833-82ac-129e28488162/console-operator/0.log" Apr 23 16:34:56.304805 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.304708 2572 generic.go:358] "Generic (PLEG): container finished" podID="d2744f4f-889f-4833-82ac-129e28488162" containerID="b9935b499c5cf94b06b93e251838fd3a2c95241a71e44baeeed9cb50216e2a92" exitCode=255 Apr 23 16:34:56.304805 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.304772 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t" event={"ID":"d2744f4f-889f-4833-82ac-129e28488162","Type":"ContainerDied","Data":"b9935b499c5cf94b06b93e251838fd3a2c95241a71e44baeeed9cb50216e2a92"} Apr 23 16:34:56.305035 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.305024 2572 scope.go:117] "RemoveContainer" containerID="b9935b499c5cf94b06b93e251838fd3a2c95241a71e44baeeed9cb50216e2a92" Apr 23 16:34:56.306712 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.306685 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wg4fk" event={"ID":"970788cb-b97f-467f-bd8e-69787c8efef5","Type":"ContainerStarted","Data":"4c082f7cc93d0bb321216db2ad7d48bd5d76524e1c8bce9d88b1ea69d325cfcc"} Apr 23 16:34:56.316502 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.316450 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66d676f5f7-7b8cr" podStartSLOduration=34.728092454 podStartE2EDuration="49.316435802s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:34:40.485179595 +0000 UTC m=+34.057692499" lastFinishedPulling="2026-04-23 16:34:55.073522934 +0000 UTC m=+48.646035847" observedRunningTime="2026-04-23 16:34:56.316343107 +0000 UTC m=+49.888856029" watchObservedRunningTime="2026-04-23 16:34:56.316435802 +0000 UTC m=+49.888948724" Apr 23 16:34:56.317854 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.317278 2572 generic.go:358] "Generic (PLEG): container finished" podID="f24243b7-5732-41e7-a97d-ff3ef6a751d0" containerID="aa3ee56ec472937388eeb569b0cb2b664a801aba78fbb8013b7960e759012ec7" exitCode=0 Apr 23 16:34:56.317854 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.317338 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r652k" event={"ID":"f24243b7-5732-41e7-a97d-ff3ef6a751d0","Type":"ContainerDied","Data":"aa3ee56ec472937388eeb569b0cb2b664a801aba78fbb8013b7960e759012ec7"} Apr 23 16:34:56.331368 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.331279 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-d7t2c" event={"ID":"1fcbd9bc-88ba-48d1-978b-f8e2585ab84c","Type":"ContainerStarted","Data":"ee5b31ea38e12c4cade8571fc75020025cf0bd57fcadda9a97552108c714aa7b"} Apr 23 16:34:56.335527 
ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.335459 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdfd48895-qqsc9" event={"ID":"4b7746a0-7bbc-43fc-86d8-43f0e52d77c5","Type":"ContainerStarted","Data":"41d896f4485173b5889b4a1e6a942a8b4264764d3bba1ca2dce556f060d4374c"} Apr 23 16:34:56.339830 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.338822 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-lfp22" event={"ID":"af33177a-4a99-4b44-8427-ff5e05da026f","Type":"ContainerStarted","Data":"69efd057816db4b47e3c09191a885973652d7595c30c2b88c3b3ff82c88ed343"} Apr 23 16:34:56.339830 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.339149 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xt7q" podStartSLOduration=34.66988317 podStartE2EDuration="49.339136379s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:34:40.424643217 +0000 UTC m=+33.997156121" lastFinishedPulling="2026-04-23 16:34:55.093896417 +0000 UTC m=+48.666409330" observedRunningTime="2026-04-23 16:34:56.337742182 +0000 UTC m=+49.910255103" watchObservedRunningTime="2026-04-23 16:34:56.339136379 +0000 UTC m=+49.911649301" Apr 23 16:34:56.342986 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.342906 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nqvgv" event={"ID":"cef5ddfd-948f-4294-a2d2-9123e23feea6","Type":"ContainerStarted","Data":"cce9c59bb8b4e409f1a5e209b5c0b34cc90be576d070bbf0f60413568fe2c1bf"} Apr 23 16:34:56.346541 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.346187 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-d7t2c" Apr 23 16:34:56.350975 
ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.350939 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z" event={"ID":"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d","Type":"ContainerStarted","Data":"33eb1822c74dd306be8bb2d37bc9afdbc2ff355163ece991bde3f309eaf20ab7"} Apr 23 16:34:56.362815 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.357458 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wg4fk" podStartSLOduration=34.69135588 podStartE2EDuration="49.357443902s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:34:40.407702179 +0000 UTC m=+33.980215097" lastFinishedPulling="2026-04-23 16:34:55.073790213 +0000 UTC m=+48.646303119" observedRunningTime="2026-04-23 16:34:56.356884624 +0000 UTC m=+49.929397547" watchObservedRunningTime="2026-04-23 16:34:56.357443902 +0000 UTC m=+49.929956824" Apr 23 16:34:56.458131 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.458076 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7rh2n" podStartSLOduration=34.863656017 podStartE2EDuration="49.458041974s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:34:40.499585491 +0000 UTC m=+34.072098393" lastFinishedPulling="2026-04-23 16:34:55.093971435 +0000 UTC m=+48.666484350" observedRunningTime="2026-04-23 16:34:56.427621169 +0000 UTC m=+50.000134089" watchObservedRunningTime="2026-04-23 16:34:56.458041974 +0000 UTC m=+50.030554896" Apr 23 16:34:56.458624 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.458592 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-lfp22" podStartSLOduration=34.839202409 
podStartE2EDuration="49.458581678s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:34:40.454267393 +0000 UTC m=+34.026780299" lastFinishedPulling="2026-04-23 16:34:55.073646653 +0000 UTC m=+48.646159568" observedRunningTime="2026-04-23 16:34:56.457267423 +0000 UTC m=+50.029780344" watchObservedRunningTime="2026-04-23 16:34:56.458581678 +0000 UTC m=+50.031094605" Apr 23 16:34:56.479226 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.477274 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-d7t2c" podStartSLOduration=35.472078015 podStartE2EDuration="49.477258385s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:34:41.10076639 +0000 UTC m=+34.673279293" lastFinishedPulling="2026-04-23 16:34:55.105946752 +0000 UTC m=+48.678459663" observedRunningTime="2026-04-23 16:34:56.476967299 +0000 UTC m=+50.049480222" watchObservedRunningTime="2026-04-23 16:34:56.477258385 +0000 UTC m=+50.049771306" Apr 23 16:34:56.500953 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.500626 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-cdfd48895-qqsc9" podStartSLOduration=34.893846766 podStartE2EDuration="49.500608996s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:34:40.488988229 +0000 UTC m=+34.061501132" lastFinishedPulling="2026-04-23 16:34:55.095750463 +0000 UTC m=+48.668263362" observedRunningTime="2026-04-23 16:34:56.499997993 +0000 UTC m=+50.072510915" watchObservedRunningTime="2026-04-23 16:34:56.500608996 +0000 UTC m=+50.073121917" Apr 23 16:34:56.555966 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:56.555627 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nqvgv" 
podStartSLOduration=34.91432642 podStartE2EDuration="49.555610454s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:34:40.341778566 +0000 UTC m=+33.914291479" lastFinishedPulling="2026-04-23 16:34:54.983062613 +0000 UTC m=+48.555575513" observedRunningTime="2026-04-23 16:34:56.554956292 +0000 UTC m=+50.127469212" watchObservedRunningTime="2026-04-23 16:34:56.555610454 +0000 UTC m=+50.128123378" Apr 23 16:34:57.358667 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:57.358631 2572 generic.go:358] "Generic (PLEG): container finished" podID="f24243b7-5732-41e7-a97d-ff3ef6a751d0" containerID="40b386e94334b92e6fb59416d46ec0e626c323619b0f4257acef6199e9e9bd2a" exitCode=0 Apr 23 16:34:57.359152 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:57.358783 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r652k" event={"ID":"f24243b7-5732-41e7-a97d-ff3ef6a751d0","Type":"ContainerDied","Data":"40b386e94334b92e6fb59416d46ec0e626c323619b0f4257acef6199e9e9bd2a"} Apr 23 16:34:57.361842 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:57.361821 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bwh5t_d2744f4f-889f-4833-82ac-129e28488162/console-operator/1.log" Apr 23 16:34:57.362775 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:57.362755 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bwh5t_d2744f4f-889f-4833-82ac-129e28488162/console-operator/0.log" Apr 23 16:34:57.362885 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:57.362792 2572 generic.go:358] "Generic (PLEG): container finished" podID="d2744f4f-889f-4833-82ac-129e28488162" containerID="4fc81381196d362a55f2f7e043b7bd3c325798561b9a7bc9ca7068d23e8512be" exitCode=255 Apr 23 16:34:57.363762 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:57.363406 2572 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t" event={"ID":"d2744f4f-889f-4833-82ac-129e28488162","Type":"ContainerDied","Data":"4fc81381196d362a55f2f7e043b7bd3c325798561b9a7bc9ca7068d23e8512be"} Apr 23 16:34:57.363762 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:57.363462 2572 scope.go:117] "RemoveContainer" containerID="b9935b499c5cf94b06b93e251838fd3a2c95241a71e44baeeed9cb50216e2a92" Apr 23 16:34:57.364385 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:57.364364 2572 scope.go:117] "RemoveContainer" containerID="4fc81381196d362a55f2f7e043b7bd3c325798561b9a7bc9ca7068d23e8512be" Apr 23 16:34:57.364551 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:57.364522 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-bwh5t_openshift-console-operator(d2744f4f-889f-4833-82ac-129e28488162)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t" podUID="d2744f4f-889f-4833-82ac-129e28488162" Apr 23 16:34:58.038875 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:58.038836 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-g4ksv"] Apr 23 16:34:58.043032 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:58.043004 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-g4ksv" Apr 23 16:34:58.045671 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:58.045584 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 23 16:34:58.046812 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:58.046789 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-8c6tp\"" Apr 23 16:34:58.046812 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:58.046802 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 23 16:34:58.051145 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:58.051119 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-g4ksv"] Apr 23 16:34:58.183353 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:58.183316 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq5v7\" (UniqueName: \"kubernetes.io/projected/6fd373ce-6bf6-480e-b726-08169e5b5b2f-kube-api-access-kq5v7\") pod \"migrator-74bb7799d9-g4ksv\" (UID: \"6fd373ce-6bf6-480e-b726-08169e5b5b2f\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-g4ksv" Apr 23 16:34:58.285002 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:58.284757 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kq5v7\" (UniqueName: \"kubernetes.io/projected/6fd373ce-6bf6-480e-b726-08169e5b5b2f-kube-api-access-kq5v7\") pod \"migrator-74bb7799d9-g4ksv\" (UID: \"6fd373ce-6bf6-480e-b726-08169e5b5b2f\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-g4ksv" Apr 23 16:34:58.295314 ip-10-0-133-231 
kubenswrapper[2572]: I0423 16:34:58.295290 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq5v7\" (UniqueName: \"kubernetes.io/projected/6fd373ce-6bf6-480e-b726-08169e5b5b2f-kube-api-access-kq5v7\") pod \"migrator-74bb7799d9-g4ksv\" (UID: \"6fd373ce-6bf6-480e-b726-08169e5b5b2f\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-g4ksv" Apr 23 16:34:58.355573 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:58.355541 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-g4ksv" Apr 23 16:34:58.368025 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:58.368000 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bwh5t_d2744f4f-889f-4833-82ac-129e28488162/console-operator/1.log" Apr 23 16:34:58.368471 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:58.368450 2572 scope.go:117] "RemoveContainer" containerID="4fc81381196d362a55f2f7e043b7bd3c325798561b9a7bc9ca7068d23e8512be" Apr 23 16:34:58.368721 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:34:58.368694 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-bwh5t_openshift-console-operator(d2744f4f-889f-4833-82ac-129e28488162)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t" podUID="d2744f4f-889f-4833-82ac-129e28488162" Apr 23 16:34:58.373401 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:58.373360 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r652k" event={"ID":"f24243b7-5732-41e7-a97d-ff3ef6a751d0","Type":"ContainerStarted","Data":"8592648b549f1c20ed4135b90c219fe6df40ef12c0cd1abbb3d537d49c477051"} Apr 23 16:34:58.375480 ip-10-0-133-231 kubenswrapper[2572]: 
I0423 16:34:58.375459 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z" event={"ID":"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d","Type":"ContainerStarted","Data":"ad9ea5424d52a43407b79375ea9a59e709006fe2443d7c7cab48c9fb2f4ffcce"} Apr 23 16:34:58.375588 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:58.375490 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z" event={"ID":"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d","Type":"ContainerStarted","Data":"64dc56cdc3c57dfc3d987882dd83797e0878c5e4dcd1a8a90d4b399998680264"} Apr 23 16:34:58.407503 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:58.407417 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z" podStartSLOduration=33.51754516 podStartE2EDuration="51.407399121s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:34:40.347624986 +0000 UTC m=+33.920137888" lastFinishedPulling="2026-04-23 16:34:58.237478948 +0000 UTC m=+51.809991849" observedRunningTime="2026-04-23 16:34:58.407035048 +0000 UTC m=+51.979547970" watchObservedRunningTime="2026-04-23 16:34:58.407399121 +0000 UTC m=+51.979912045" Apr 23 16:34:58.431749 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:58.431691 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-r652k" podStartSLOduration=6.07732649 podStartE2EDuration="51.431669035s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:34:09.746836457 +0000 UTC m=+3.319349366" lastFinishedPulling="2026-04-23 16:34:55.101179005 +0000 UTC m=+48.673691911" observedRunningTime="2026-04-23 16:34:58.430332806 +0000 UTC m=+52.002845739" watchObservedRunningTime="2026-04-23 16:34:58.431669035 +0000 UTC 
m=+52.004181958" Apr 23 16:34:58.496486 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:58.496442 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-g4ksv"] Apr 23 16:34:58.498985 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:34:58.498900 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fd373ce_6bf6_480e_b726_08169e5b5b2f.slice/crio-96ead9a65aedb5f411bc95a68e593e305ec087a71a902b599662d28f72f5145f WatchSource:0}: Error finding container 96ead9a65aedb5f411bc95a68e593e305ec087a71a902b599662d28f72f5145f: Status 404 returned error can't find the container with id 96ead9a65aedb5f411bc95a68e593e305ec087a71a902b599662d28f72f5145f Apr 23 16:34:59.287581 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:59.287553 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mqskj_7789b2de-75cc-4057-8981-8850b48ac765/dns-node-resolver/0.log" Apr 23 16:34:59.380505 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:59.380435 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-g4ksv" event={"ID":"6fd373ce-6bf6-480e-b726-08169e5b5b2f","Type":"ContainerStarted","Data":"96ead9a65aedb5f411bc95a68e593e305ec087a71a902b599662d28f72f5145f"} Apr 23 16:34:59.445589 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:59.443331 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-fdvpd"] Apr 23 16:34:59.447732 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:59.447707 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-fdvpd" Apr 23 16:34:59.450486 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:59.450453 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 23 16:34:59.450576 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:59.450510 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 23 16:34:59.452086 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:59.452064 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 23 16:34:59.452199 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:59.452128 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 23 16:34:59.452397 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:59.452348 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-8sxm7\"" Apr 23 16:34:59.454258 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:59.454239 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-fdvpd"] Apr 23 16:34:59.497475 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:59.497430 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/181c9cb6-13b8-4b3d-b48a-6ec90b66d746-signing-key\") pod \"service-ca-865cb79987-fdvpd\" (UID: \"181c9cb6-13b8-4b3d-b48a-6ec90b66d746\") " pod="openshift-service-ca/service-ca-865cb79987-fdvpd" Apr 23 16:34:59.497688 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:59.497667 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jswqt\" (UniqueName: 
\"kubernetes.io/projected/181c9cb6-13b8-4b3d-b48a-6ec90b66d746-kube-api-access-jswqt\") pod \"service-ca-865cb79987-fdvpd\" (UID: \"181c9cb6-13b8-4b3d-b48a-6ec90b66d746\") " pod="openshift-service-ca/service-ca-865cb79987-fdvpd" Apr 23 16:34:59.498147 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:59.498101 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/181c9cb6-13b8-4b3d-b48a-6ec90b66d746-signing-cabundle\") pod \"service-ca-865cb79987-fdvpd\" (UID: \"181c9cb6-13b8-4b3d-b48a-6ec90b66d746\") " pod="openshift-service-ca/service-ca-865cb79987-fdvpd" Apr 23 16:34:59.598899 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:59.598807 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/181c9cb6-13b8-4b3d-b48a-6ec90b66d746-signing-cabundle\") pod \"service-ca-865cb79987-fdvpd\" (UID: \"181c9cb6-13b8-4b3d-b48a-6ec90b66d746\") " pod="openshift-service-ca/service-ca-865cb79987-fdvpd" Apr 23 16:34:59.598899 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:59.598897 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/181c9cb6-13b8-4b3d-b48a-6ec90b66d746-signing-key\") pod \"service-ca-865cb79987-fdvpd\" (UID: \"181c9cb6-13b8-4b3d-b48a-6ec90b66d746\") " pod="openshift-service-ca/service-ca-865cb79987-fdvpd" Apr 23 16:34:59.599140 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:59.599005 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jswqt\" (UniqueName: \"kubernetes.io/projected/181c9cb6-13b8-4b3d-b48a-6ec90b66d746-kube-api-access-jswqt\") pod \"service-ca-865cb79987-fdvpd\" (UID: \"181c9cb6-13b8-4b3d-b48a-6ec90b66d746\") " pod="openshift-service-ca/service-ca-865cb79987-fdvpd" Apr 23 16:34:59.599590 ip-10-0-133-231 kubenswrapper[2572]: I0423 
16:34:59.599560 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/181c9cb6-13b8-4b3d-b48a-6ec90b66d746-signing-cabundle\") pod \"service-ca-865cb79987-fdvpd\" (UID: \"181c9cb6-13b8-4b3d-b48a-6ec90b66d746\") " pod="openshift-service-ca/service-ca-865cb79987-fdvpd" Apr 23 16:34:59.602045 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:59.602015 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/181c9cb6-13b8-4b3d-b48a-6ec90b66d746-signing-key\") pod \"service-ca-865cb79987-fdvpd\" (UID: \"181c9cb6-13b8-4b3d-b48a-6ec90b66d746\") " pod="openshift-service-ca/service-ca-865cb79987-fdvpd" Apr 23 16:34:59.608371 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:59.608343 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jswqt\" (UniqueName: \"kubernetes.io/projected/181c9cb6-13b8-4b3d-b48a-6ec90b66d746-kube-api-access-jswqt\") pod \"service-ca-865cb79987-fdvpd\" (UID: \"181c9cb6-13b8-4b3d-b48a-6ec90b66d746\") " pod="openshift-service-ca/service-ca-865cb79987-fdvpd" Apr 23 16:34:59.759332 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:59.759297 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-fdvpd" Apr 23 16:34:59.886700 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:34:59.886627 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-dp4mb_10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa/node-ca/0.log" Apr 23 16:35:00.152954 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:00.152862 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t" Apr 23 16:35:00.152954 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:00.152894 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t" Apr 23 16:35:00.153351 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:00.153334 2572 scope.go:117] "RemoveContainer" containerID="4fc81381196d362a55f2f7e043b7bd3c325798561b9a7bc9ca7068d23e8512be" Apr 23 16:35:00.153541 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:35:00.153522 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-bwh5t_openshift-console-operator(d2744f4f-889f-4833-82ac-129e28488162)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t" podUID="d2744f4f-889f-4833-82ac-129e28488162" Apr 23 16:35:00.840940 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:00.840898 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-fdvpd"] Apr 23 16:35:00.843496 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:35:00.843473 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod181c9cb6_13b8_4b3d_b48a_6ec90b66d746.slice/crio-f354f77853832c38293ab5093820611f2181127c626591d5342e3ab17099a758 WatchSource:0}: 
Error finding container f354f77853832c38293ab5093820611f2181127c626591d5342e3ab17099a758: Status 404 returned error can't find the container with id f354f77853832c38293ab5093820611f2181127c626591d5342e3ab17099a758 Apr 23 16:35:01.388702 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:01.388656 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-p5ms5" event={"ID":"9ea279d1-4c47-464b-9c84-1868a227c6b6","Type":"ContainerStarted","Data":"e7c5015fa2936c843b224086645d30037b30c445f5288292065c6b8f566a53a1"} Apr 23 16:35:01.389995 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:01.389965 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-fdvpd" event={"ID":"181c9cb6-13b8-4b3d-b48a-6ec90b66d746","Type":"ContainerStarted","Data":"0af6d53b924933b2e71a9c3d48ff925079f33e4a3cd7493957bbdb60d35a6d57"} Apr 23 16:35:01.390122 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:01.390003 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-fdvpd" event={"ID":"181c9cb6-13b8-4b3d-b48a-6ec90b66d746","Type":"ContainerStarted","Data":"f354f77853832c38293ab5093820611f2181127c626591d5342e3ab17099a758"} Apr 23 16:35:01.391377 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:01.391351 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-g4ksv" event={"ID":"6fd373ce-6bf6-480e-b726-08169e5b5b2f","Type":"ContainerStarted","Data":"936af71073793fa15347e7b249920734d3d72591f582910445bdf74e17e2b218"} Apr 23 16:35:01.391377 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:01.391379 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-g4ksv" event={"ID":"6fd373ce-6bf6-480e-b726-08169e5b5b2f","Type":"ContainerStarted","Data":"6166d12d8075222a7118cdc140c02d26ce7a48edd9ec09042ee600ba22e897bb"} Apr 23 16:35:01.408317 
ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:01.408273 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-p5ms5" podStartSLOduration=5.964236215 podStartE2EDuration="11.408260681s" podCreationTimestamp="2026-04-23 16:34:50 +0000 UTC" firstStartedPulling="2026-04-23 16:34:55.269505252 +0000 UTC m=+48.842018166" lastFinishedPulling="2026-04-23 16:35:00.713529719 +0000 UTC m=+54.286042632" observedRunningTime="2026-04-23 16:35:01.407989732 +0000 UTC m=+54.980502664" watchObservedRunningTime="2026-04-23 16:35:01.408260681 +0000 UTC m=+54.980773600" Apr 23 16:35:01.429477 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:01.429434 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-fdvpd" podStartSLOduration=2.429422874 podStartE2EDuration="2.429422874s" podCreationTimestamp="2026-04-23 16:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:01.428712885 +0000 UTC m=+55.001225807" watchObservedRunningTime="2026-04-23 16:35:01.429422874 +0000 UTC m=+55.001935795" Apr 23 16:35:01.447210 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:01.447169 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-g4ksv" podStartSLOduration=1.240322363 podStartE2EDuration="3.447158189s" podCreationTimestamp="2026-04-23 16:34:58 +0000 UTC" firstStartedPulling="2026-04-23 16:34:58.50125124 +0000 UTC m=+52.073764139" lastFinishedPulling="2026-04-23 16:35:00.708087052 +0000 UTC m=+54.280599965" observedRunningTime="2026-04-23 16:35:01.446361709 +0000 UTC m=+55.018874629" watchObservedRunningTime="2026-04-23 16:35:01.447158189 +0000 UTC m=+55.019671110" Apr 23 16:35:04.177485 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:04.177454 2572 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-95t7c" Apr 23 16:35:11.026608 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.026579 2572 scope.go:117] "RemoveContainer" containerID="4fc81381196d362a55f2f7e043b7bd3c325798561b9a7bc9ca7068d23e8512be" Apr 23 16:35:11.420510 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.420472 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bwh5t_d2744f4f-889f-4833-82ac-129e28488162/console-operator/2.log" Apr 23 16:35:11.420927 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.420898 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bwh5t_d2744f4f-889f-4833-82ac-129e28488162/console-operator/1.log" Apr 23 16:35:11.420984 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.420950 2572 generic.go:358] "Generic (PLEG): container finished" podID="d2744f4f-889f-4833-82ac-129e28488162" containerID="a22aaf7ac455b388f4e9526067fafbda4a3623627b8864c80ffaefe8aab94256" exitCode=255 Apr 23 16:35:11.421018 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.421008 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t" event={"ID":"d2744f4f-889f-4833-82ac-129e28488162","Type":"ContainerDied","Data":"a22aaf7ac455b388f4e9526067fafbda4a3623627b8864c80ffaefe8aab94256"} Apr 23 16:35:11.421052 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.421039 2572 scope.go:117] "RemoveContainer" containerID="4fc81381196d362a55f2f7e043b7bd3c325798561b9a7bc9ca7068d23e8512be" Apr 23 16:35:11.421333 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.421316 2572 scope.go:117] "RemoveContainer" containerID="a22aaf7ac455b388f4e9526067fafbda4a3623627b8864c80ffaefe8aab94256" Apr 23 16:35:11.421518 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:35:11.421498 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-bwh5t_openshift-console-operator(d2744f4f-889f-4833-82ac-129e28488162)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t" podUID="d2744f4f-889f-4833-82ac-129e28488162" Apr 23 16:35:11.710044 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.709959 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6e3b3b5e-7dd7-421c-9733-f304050ddbce-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d4nsk\" (UID: \"6e3b3b5e-7dd7-421c-9733-f304050ddbce\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk" Apr 23 16:35:11.710044 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.710007 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb" Apr 23 16:35:11.710044 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.710027 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7" Apr 23 16:35:11.710252 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.710091 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-samples-operator-tls\") pod 
\"cluster-samples-operator-6dc5bdb6b4-gc997\" (UID: \"6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gc997" Apr 23 16:35:11.712603 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.712568 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gc997\" (UID: \"6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gc997" Apr 23 16:35:11.712728 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.712568 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls\") pod \"image-registry-57798c5bd5-257tb\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " pod="openshift-image-registry/image-registry-57798c5bd5-257tb" Apr 23 16:35:11.712840 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.712820 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls\") pod \"image-registry-5d445c8494-5dps7\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " pod="openshift-image-registry/image-registry-5d445c8494-5dps7" Apr 23 16:35:11.712949 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.712910 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6e3b3b5e-7dd7-421c-9733-f304050ddbce-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d4nsk\" (UID: \"6e3b3b5e-7dd7-421c-9733-f304050ddbce\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk" Apr 23 16:35:11.811409 ip-10-0-133-231 
kubenswrapper[2572]: I0423 16:35:11.811379 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa0af365-ad6e-4695-bd7d-c6838cbcf027-service-ca-bundle\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l" Apr 23 16:35:11.811564 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.811421 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6pbrl\" (UID: \"860ae3e8-8bc5-4280-aefd-e5190c5e1db8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6pbrl" Apr 23 16:35:11.811564 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.811448 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4778171-44e0-4227-8cef-29899b536604-cert\") pod \"ingress-canary-hsn4f\" (UID: \"b4778171-44e0-4227-8cef-29899b536604\") " pod="openshift-ingress-canary/ingress-canary-hsn4f" Apr 23 16:35:11.811564 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.811477 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d997e103-1c8d-4bb4-a579-2d6b344c089f-metrics-tls\") pod \"dns-default-g9pgx\" (UID: \"d997e103-1c8d-4bb4-a579-2d6b344c089f\") " pod="openshift-dns/dns-default-g9pgx" Apr 23 16:35:11.811564 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.811502 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-metrics-certs\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " 
pod="openshift-ingress/router-default-77d69779b-8754l" Apr 23 16:35:11.812107 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.812079 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa0af365-ad6e-4695-bd7d-c6838cbcf027-service-ca-bundle\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l" Apr 23 16:35:11.814439 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.814415 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d997e103-1c8d-4bb4-a579-2d6b344c089f-metrics-tls\") pod \"dns-default-g9pgx\" (UID: \"d997e103-1c8d-4bb4-a579-2d6b344c089f\") " pod="openshift-dns/dns-default-g9pgx" Apr 23 16:35:11.814524 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.814498 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/860ae3e8-8bc5-4280-aefd-e5190c5e1db8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6pbrl\" (UID: \"860ae3e8-8bc5-4280-aefd-e5190c5e1db8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6pbrl" Apr 23 16:35:11.814582 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.814552 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0af365-ad6e-4695-bd7d-c6838cbcf027-metrics-certs\") pod \"router-default-77d69779b-8754l\" (UID: \"fa0af365-ad6e-4695-bd7d-c6838cbcf027\") " pod="openshift-ingress/router-default-77d69779b-8754l" Apr 23 16:35:11.814785 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.814765 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4778171-44e0-4227-8cef-29899b536604-cert\") pod \"ingress-canary-hsn4f\" 
(UID: \"b4778171-44e0-4227-8cef-29899b536604\") " pod="openshift-ingress-canary/ingress-canary-hsn4f" Apr 23 16:35:11.905719 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.905694 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-q6vbq\"" Apr 23 16:35:11.914110 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.914072 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d445c8494-5dps7" Apr 23 16:35:11.918865 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.918847 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57798c5bd5-257tb" Apr 23 16:35:11.931157 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.931137 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-2b8hh\"" Apr 23 16:35:11.938737 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.938720 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk" Apr 23 16:35:11.946050 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.946025 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-s82cg\"" Apr 23 16:35:11.954133 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.954041 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gc997" Apr 23 16:35:11.965434 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.965411 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-zfkkk\"" Apr 23 16:35:11.973136 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.972898 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6pbrl" Apr 23 16:35:11.985423 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.984836 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-bkcjp\"" Apr 23 16:35:11.997108 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:11.995426 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-77d69779b-8754l" Apr 23 16:35:12.008706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.008684 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-nghfn\"" Apr 23 16:35:12.021420 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.019051 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hsn4f" Apr 23 16:35:12.023541 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.023227 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tz9n7\"" Apr 23 16:35:12.037704 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.037150 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-g9pgx" Apr 23 16:35:12.199847 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.197171 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d445c8494-5dps7"] Apr 23 16:35:12.209259 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:35:12.209188 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1ccc661_cbd0_49b7_b172_5749b2c3e73f.slice/crio-92982dca6cfa6b75c69304d1a76c75d22fd64be8b06ce3723c4e5c520fca08f6 WatchSource:0}: Error finding container 92982dca6cfa6b75c69304d1a76c75d22fd64be8b06ce3723c4e5c520fca08f6: Status 404 returned error can't find the container with id 92982dca6cfa6b75c69304d1a76c75d22fd64be8b06ce3723c4e5c520fca08f6 Apr 23 16:35:12.210359 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.210305 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57798c5bd5-257tb"] Apr 23 16:35:12.257308 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.257255 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gc997"] Apr 23 16:35:12.291895 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.291866 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-6pbrl"] Apr 23 16:35:12.297279 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:35:12.297243 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod860ae3e8_8bc5_4280_aefd_e5190c5e1db8.slice/crio-3ae7f1b9f78dd7c72ee8ff84c52923b59590c934f1abd6de21ce977bcab6e793 WatchSource:0}: Error finding container 3ae7f1b9f78dd7c72ee8ff84c52923b59590c934f1abd6de21ce977bcab6e793: Status 404 returned error can't find the container with id 
3ae7f1b9f78dd7c72ee8ff84c52923b59590c934f1abd6de21ce977bcab6e793 Apr 23 16:35:12.352363 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.352341 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hsn4f"] Apr 23 16:35:12.355177 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:35:12.355155 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4778171_44e0_4227_8cef_29899b536604.slice/crio-17442cc233426596c5b70e8d89d697b2611d3d87637df777d83209b6c11f9b81 WatchSource:0}: Error finding container 17442cc233426596c5b70e8d89d697b2611d3d87637df777d83209b6c11f9b81: Status 404 returned error can't find the container with id 17442cc233426596c5b70e8d89d697b2611d3d87637df777d83209b6c11f9b81 Apr 23 16:35:12.424969 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.424950 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bwh5t_d2744f4f-889f-4833-82ac-129e28488162/console-operator/2.log" Apr 23 16:35:12.426070 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.426041 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6pbrl" event={"ID":"860ae3e8-8bc5-4280-aefd-e5190c5e1db8","Type":"ContainerStarted","Data":"3ae7f1b9f78dd7c72ee8ff84c52923b59590c934f1abd6de21ce977bcab6e793"} Apr 23 16:35:12.427547 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.427512 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d445c8494-5dps7" event={"ID":"a1ccc661-cbd0-49b7-b172-5749b2c3e73f","Type":"ContainerStarted","Data":"42fd752046f9e980d03d4c27409316a75351d61dd7c0862958ef015d897b4689"} Apr 23 16:35:12.427669 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.427592 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d445c8494-5dps7" 
event={"ID":"a1ccc661-cbd0-49b7-b172-5749b2c3e73f","Type":"ContainerStarted","Data":"92982dca6cfa6b75c69304d1a76c75d22fd64be8b06ce3723c4e5c520fca08f6"} Apr 23 16:35:12.427731 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.427672 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5d445c8494-5dps7" Apr 23 16:35:12.429253 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.429216 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57798c5bd5-257tb" event={"ID":"5460e905-1b76-4876-8552-7f9866fe1dc0","Type":"ContainerStarted","Data":"ffe37736cf3d884e8a4cd021ad6ac210401d6df916ec2625779f47e0033a1d81"} Apr 23 16:35:12.429353 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.429264 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57798c5bd5-257tb" event={"ID":"5460e905-1b76-4876-8552-7f9866fe1dc0","Type":"ContainerStarted","Data":"976c9181c969263244013b27231e1b3fcd47b646bbd369940c8eee21d9f7539b"} Apr 23 16:35:12.429353 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.429331 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-57798c5bd5-257tb" Apr 23 16:35:12.430343 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.430311 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hsn4f" event={"ID":"b4778171-44e0-4227-8cef-29899b536604","Type":"ContainerStarted","Data":"17442cc233426596c5b70e8d89d697b2611d3d87637df777d83209b6c11f9b81"} Apr 23 16:35:12.431468 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.431449 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gc997" 
event={"ID":"6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb","Type":"ContainerStarted","Data":"fbb93433d1bd92d85a87055b79e67d537df5648a1990636794352b1cb1067e77"} Apr 23 16:35:12.448119 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.448077 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5d445c8494-5dps7" podStartSLOduration=65.448063167 podStartE2EDuration="1m5.448063167s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:12.447244214 +0000 UTC m=+66.019757160" watchObservedRunningTime="2026-04-23 16:35:12.448063167 +0000 UTC m=+66.020576100" Apr 23 16:35:12.458108 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.458087 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk"] Apr 23 16:35:12.460969 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:35:12.460905 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e3b3b5e_7dd7_421c_9733_f304050ddbce.slice/crio-6604c8b1cc291956c9d20c83481eee80222620860102811e1474baa0cb46af8d WatchSource:0}: Error finding container 6604c8b1cc291956c9d20c83481eee80222620860102811e1474baa0cb46af8d: Status 404 returned error can't find the container with id 6604c8b1cc291956c9d20c83481eee80222620860102811e1474baa0cb46af8d Apr 23 16:35:12.477309 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.477243 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-57798c5bd5-257tb" podStartSLOduration=65.477231302 podStartE2EDuration="1m5.477231302s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-23 16:35:12.475803491 +0000 UTC m=+66.048316412" watchObservedRunningTime="2026-04-23 16:35:12.477231302 +0000 UTC m=+66.049744223" Apr 23 16:35:12.513432 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.513382 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g9pgx"] Apr 23 16:35:12.517181 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.517149 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-77d69779b-8754l"] Apr 23 16:35:12.518018 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:35:12.517974 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd997e103_1c8d_4bb4_a579_2d6b344c089f.slice/crio-870363627d66269551b38aa1e5e4ad7a73f2bd73648bcf626ea3bcec254e331e WatchSource:0}: Error finding container 870363627d66269551b38aa1e5e4ad7a73f2bd73648bcf626ea3bcec254e331e: Status 404 returned error can't find the container with id 870363627d66269551b38aa1e5e4ad7a73f2bd73648bcf626ea3bcec254e331e Apr 23 16:35:12.521647 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:35:12.520626 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa0af365_ad6e_4695_bd7d_c6838cbcf027.slice/crio-c3dfa5f315ad926e8cc0385289785005ff6daaa9acff94c7715bf3c6bc4d036d WatchSource:0}: Error finding container c3dfa5f315ad926e8cc0385289785005ff6daaa9acff94c7715bf3c6bc4d036d: Status 404 returned error can't find the container with id c3dfa5f315ad926e8cc0385289785005ff6daaa9acff94c7715bf3c6bc4d036d Apr 23 16:35:12.822981 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.822892 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs\") pod \"network-metrics-daemon-glcj7\" (UID: 
\"eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef\") " pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:35:12.825602 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.825581 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 16:35:12.835976 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.835948 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef-metrics-certs\") pod \"network-metrics-daemon-glcj7\" (UID: \"eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef\") " pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:35:12.844796 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.844774 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vvm92\"" Apr 23 16:35:12.851997 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:12.851980 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-glcj7" Apr 23 16:35:13.001964 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:13.000314 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-glcj7"] Apr 23 16:35:13.006995 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:35:13.004959 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeebbda0b_dcd9_4f01_af8f_d107d7a5e1ef.slice/crio-807c67a62c9302a107fcbfa5726cff750aaabe54aa4d6578116f6f5babc4fbd6 WatchSource:0}: Error finding container 807c67a62c9302a107fcbfa5726cff750aaabe54aa4d6578116f6f5babc4fbd6: Status 404 returned error can't find the container with id 807c67a62c9302a107fcbfa5726cff750aaabe54aa4d6578116f6f5babc4fbd6 Apr 23 16:35:13.436614 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:13.436573 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g9pgx" event={"ID":"d997e103-1c8d-4bb4-a579-2d6b344c089f","Type":"ContainerStarted","Data":"870363627d66269551b38aa1e5e4ad7a73f2bd73648bcf626ea3bcec254e331e"} Apr 23 16:35:13.438765 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:13.438732 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk" event={"ID":"6e3b3b5e-7dd7-421c-9733-f304050ddbce","Type":"ContainerStarted","Data":"6604c8b1cc291956c9d20c83481eee80222620860102811e1474baa0cb46af8d"} Apr 23 16:35:13.443302 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:13.443275 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-77d69779b-8754l" event={"ID":"fa0af365-ad6e-4695-bd7d-c6838cbcf027","Type":"ContainerStarted","Data":"c0754d29cff8c0b3b532704e5895522054429fa64d3e4e3ee02904594726e046"} Apr 23 16:35:13.443398 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:13.443314 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress/router-default-77d69779b-8754l" event={"ID":"fa0af365-ad6e-4695-bd7d-c6838cbcf027","Type":"ContainerStarted","Data":"c3dfa5f315ad926e8cc0385289785005ff6daaa9acff94c7715bf3c6bc4d036d"} Apr 23 16:35:13.448595 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:13.448078 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-glcj7" event={"ID":"eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef","Type":"ContainerStarted","Data":"807c67a62c9302a107fcbfa5726cff750aaabe54aa4d6578116f6f5babc4fbd6"} Apr 23 16:35:13.469138 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:13.468190 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-77d69779b-8754l" podStartSLOduration=66.468175037 podStartE2EDuration="1m6.468175037s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:13.468081455 +0000 UTC m=+67.040594379" watchObservedRunningTime="2026-04-23 16:35:13.468175037 +0000 UTC m=+67.040687957" Apr 23 16:35:13.996706 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:13.996674 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-77d69779b-8754l" Apr 23 16:35:13.999814 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:13.999789 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-77d69779b-8754l" Apr 23 16:35:14.450984 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:14.450944 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-77d69779b-8754l" Apr 23 16:35:14.452288 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:14.452261 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-77d69779b-8754l" Apr 
23 16:35:18.465577 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:18.465540 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gc997" event={"ID":"6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb","Type":"ContainerStarted","Data":"1584be02f037b3f98c59f678242a03ccc4e349694af9d2899424050a5bd1ddcc"} Apr 23 16:35:18.465577 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:18.465583 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gc997" event={"ID":"6eb1bd1a-ce39-4a0a-a279-a7a3ff852abb","Type":"ContainerStarted","Data":"6b8edaa01c5f767b68f4afbe671aee459f4fd363320692afe98950846335a335"} Apr 23 16:35:18.467448 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:18.467369 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g9pgx" event={"ID":"d997e103-1c8d-4bb4-a579-2d6b344c089f","Type":"ContainerStarted","Data":"45742658ece5b4f01b06f9e0d143cd1de28697878793ff34b13ac7f4eab291f8"} Apr 23 16:35:18.467448 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:18.467404 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g9pgx" event={"ID":"d997e103-1c8d-4bb4-a579-2d6b344c089f","Type":"ContainerStarted","Data":"fd3c79cc6d472c88de69eee938473937ad920bf6bc5e58daf785866372ad7503"} Apr 23 16:35:18.467638 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:18.467514 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-g9pgx" Apr 23 16:35:18.468738 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:18.468715 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk" event={"ID":"6e3b3b5e-7dd7-421c-9733-f304050ddbce","Type":"ContainerStarted","Data":"f5dd1e75aa097754145fef17f7e8058beb2430d449248c297b0b00bb0553fbeb"} Apr 23 16:35:18.470059 ip-10-0-133-231 
kubenswrapper[2572]: I0423 16:35:18.470034 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6pbrl" event={"ID":"860ae3e8-8bc5-4280-aefd-e5190c5e1db8","Type":"ContainerStarted","Data":"e43b8015e54a1197a240542d4535cb721cbcbc6d63a1c951fa2601cfa82a6b9a"} Apr 23 16:35:18.471520 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:18.471500 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-glcj7" event={"ID":"eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef","Type":"ContainerStarted","Data":"5e09fd60a9ceb69423830562a0f23f76c77c2c33856b79bc4f95aa756e8a57c5"} Apr 23 16:35:18.471520 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:18.471523 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-glcj7" event={"ID":"eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef","Type":"ContainerStarted","Data":"fc35ca4999ae21405a239b76dee86972cf9a3281697cdc91bd2705f3801b6926"} Apr 23 16:35:18.472696 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:18.472678 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hsn4f" event={"ID":"b4778171-44e0-4227-8cef-29899b536604","Type":"ContainerStarted","Data":"0f460b35a91aa684c27ed53158ca7d3c6e1ceec8727928645ed14c53cdc1f4bd"} Apr 23 16:35:18.490204 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:18.490168 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gc997" podStartSLOduration=66.055197088 podStartE2EDuration="1m11.490158058s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:35:12.353376243 +0000 UTC m=+65.925889142" lastFinishedPulling="2026-04-23 16:35:17.788337209 +0000 UTC m=+71.360850112" observedRunningTime="2026-04-23 16:35:18.489065724 +0000 UTC m=+72.061578645" watchObservedRunningTime="2026-04-23 16:35:18.490158058 
+0000 UTC m=+72.062670979" Apr 23 16:35:18.521945 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:18.521891 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-glcj7" podStartSLOduration=66.741768335 podStartE2EDuration="1m11.521879825s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:35:13.008761341 +0000 UTC m=+66.581274245" lastFinishedPulling="2026-04-23 16:35:17.788872826 +0000 UTC m=+71.361385735" observedRunningTime="2026-04-23 16:35:18.520262985 +0000 UTC m=+72.092775905" watchObservedRunningTime="2026-04-23 16:35:18.521879825 +0000 UTC m=+72.094392745" Apr 23 16:35:18.543456 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:18.543410 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6pbrl" podStartSLOduration=63.055845955 podStartE2EDuration="1m8.543397953s" podCreationTimestamp="2026-04-23 16:34:10 +0000 UTC" firstStartedPulling="2026-04-23 16:35:12.300506639 +0000 UTC m=+65.873019541" lastFinishedPulling="2026-04-23 16:35:17.788058623 +0000 UTC m=+71.360571539" observedRunningTime="2026-04-23 16:35:18.542319202 +0000 UTC m=+72.114832124" watchObservedRunningTime="2026-04-23 16:35:18.543397953 +0000 UTC m=+72.115910873" Apr 23 16:35:18.563427 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:18.563355 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-hsn4f" podStartSLOduration=34.131747889 podStartE2EDuration="39.56334359s" podCreationTimestamp="2026-04-23 16:34:39 +0000 UTC" firstStartedPulling="2026-04-23 16:35:12.356870106 +0000 UTC m=+65.929383009" lastFinishedPulling="2026-04-23 16:35:17.788465799 +0000 UTC m=+71.360978710" observedRunningTime="2026-04-23 16:35:18.562070224 +0000 UTC m=+72.134583160" watchObservedRunningTime="2026-04-23 16:35:18.56334359 +0000 UTC m=+72.135856512" Apr 23 
16:35:18.582315 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:18.582257 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-g9pgx" podStartSLOduration=34.314261716 podStartE2EDuration="39.582239171s" podCreationTimestamp="2026-04-23 16:34:39 +0000 UTC" firstStartedPulling="2026-04-23 16:35:12.520901185 +0000 UTC m=+66.093414099" lastFinishedPulling="2026-04-23 16:35:17.788878651 +0000 UTC m=+71.361391554" observedRunningTime="2026-04-23 16:35:18.580873339 +0000 UTC m=+72.153386258" watchObservedRunningTime="2026-04-23 16:35:18.582239171 +0000 UTC m=+72.154752093" Apr 23 16:35:18.607999 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:18.607935 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d4nsk" podStartSLOduration=66.277839996 podStartE2EDuration="1m11.607907336s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:35:12.462715103 +0000 UTC m=+66.035228002" lastFinishedPulling="2026-04-23 16:35:17.792782433 +0000 UTC m=+71.365295342" observedRunningTime="2026-04-23 16:35:18.606788572 +0000 UTC m=+72.179301493" watchObservedRunningTime="2026-04-23 16:35:18.607907336 +0000 UTC m=+72.180420256" Apr 23 16:35:20.152450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:20.152413 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t" Apr 23 16:35:20.152450 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:20.152452 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t" Apr 23 16:35:20.152859 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:20.152796 2572 scope.go:117] "RemoveContainer" containerID="a22aaf7ac455b388f4e9526067fafbda4a3623627b8864c80ffaefe8aab94256" Apr 23 16:35:20.153008 ip-10-0-133-231 
kubenswrapper[2572]: E0423 16:35:20.152990 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-bwh5t_openshift-console-operator(d2744f4f-889f-4833-82ac-129e28488162)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t" podUID="d2744f4f-889f-4833-82ac-129e28488162" Apr 23 16:35:22.848221 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:22.848190 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-c2hhr"] Apr 23 16:35:22.888344 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:22.888308 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-57798c5bd5-257tb"] Apr 23 16:35:22.888344 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:22.888348 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-c2hhr"] Apr 23 16:35:22.888556 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:22.888455 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-c2hhr" Apr 23 16:35:22.892825 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:22.892804 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 16:35:22.893370 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:22.893352 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-nzf9q\"" Apr 23 16:35:22.893460 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:22.893359 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 16:35:22.909749 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:22.909727 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a0a3f4ea-23a5-4bdf-b548-d2b2314b583b-crio-socket\") pod \"insights-runtime-extractor-c2hhr\" (UID: \"a0a3f4ea-23a5-4bdf-b548-d2b2314b583b\") " pod="openshift-insights/insights-runtime-extractor-c2hhr" Apr 23 16:35:22.909840 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:22.909752 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a0a3f4ea-23a5-4bdf-b548-d2b2314b583b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c2hhr\" (UID: \"a0a3f4ea-23a5-4bdf-b548-d2b2314b583b\") " pod="openshift-insights/insights-runtime-extractor-c2hhr" Apr 23 16:35:22.909840 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:22.909803 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4prmh\" (UniqueName: \"kubernetes.io/projected/a0a3f4ea-23a5-4bdf-b548-d2b2314b583b-kube-api-access-4prmh\") pod \"insights-runtime-extractor-c2hhr\" (UID: 
\"a0a3f4ea-23a5-4bdf-b548-d2b2314b583b\") " pod="openshift-insights/insights-runtime-extractor-c2hhr"
Apr 23 16:35:22.909934 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:22.909854 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a0a3f4ea-23a5-4bdf-b548-d2b2314b583b-data-volume\") pod \"insights-runtime-extractor-c2hhr\" (UID: \"a0a3f4ea-23a5-4bdf-b548-d2b2314b583b\") " pod="openshift-insights/insights-runtime-extractor-c2hhr"
Apr 23 16:35:22.909934 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:22.909901 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a0a3f4ea-23a5-4bdf-b548-d2b2314b583b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c2hhr\" (UID: \"a0a3f4ea-23a5-4bdf-b548-d2b2314b583b\") " pod="openshift-insights/insights-runtime-extractor-c2hhr"
Apr 23 16:35:22.987866 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:22.987832 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-m9crc"]
Apr 23 16:35:23.010834 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.010798 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a0a3f4ea-23a5-4bdf-b548-d2b2314b583b-crio-socket\") pod \"insights-runtime-extractor-c2hhr\" (UID: \"a0a3f4ea-23a5-4bdf-b548-d2b2314b583b\") " pod="openshift-insights/insights-runtime-extractor-c2hhr"
Apr 23 16:35:23.011004 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.010846 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a0a3f4ea-23a5-4bdf-b548-d2b2314b583b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c2hhr\" (UID: \"a0a3f4ea-23a5-4bdf-b548-d2b2314b583b\") " pod="openshift-insights/insights-runtime-extractor-c2hhr"
Apr 23 16:35:23.011004 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.010882 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4prmh\" (UniqueName: \"kubernetes.io/projected/a0a3f4ea-23a5-4bdf-b548-d2b2314b583b-kube-api-access-4prmh\") pod \"insights-runtime-extractor-c2hhr\" (UID: \"a0a3f4ea-23a5-4bdf-b548-d2b2314b583b\") " pod="openshift-insights/insights-runtime-extractor-c2hhr"
Apr 23 16:35:23.011004 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.010905 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a0a3f4ea-23a5-4bdf-b548-d2b2314b583b-crio-socket\") pod \"insights-runtime-extractor-c2hhr\" (UID: \"a0a3f4ea-23a5-4bdf-b548-d2b2314b583b\") " pod="openshift-insights/insights-runtime-extractor-c2hhr"
Apr 23 16:35:23.011004 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.010987 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a0a3f4ea-23a5-4bdf-b548-d2b2314b583b-data-volume\") pod \"insights-runtime-extractor-c2hhr\" (UID: \"a0a3f4ea-23a5-4bdf-b548-d2b2314b583b\") " pod="openshift-insights/insights-runtime-extractor-c2hhr"
Apr 23 16:35:23.011177 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.011035 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a0a3f4ea-23a5-4bdf-b548-d2b2314b583b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c2hhr\" (UID: \"a0a3f4ea-23a5-4bdf-b548-d2b2314b583b\") " pod="openshift-insights/insights-runtime-extractor-c2hhr"
Apr 23 16:35:23.011408 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.011381 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a0a3f4ea-23a5-4bdf-b548-d2b2314b583b-data-volume\") pod \"insights-runtime-extractor-c2hhr\" (UID: \"a0a3f4ea-23a5-4bdf-b548-d2b2314b583b\") " pod="openshift-insights/insights-runtime-extractor-c2hhr"
Apr 23 16:35:23.011533 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.011511 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a0a3f4ea-23a5-4bdf-b548-d2b2314b583b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c2hhr\" (UID: \"a0a3f4ea-23a5-4bdf-b548-d2b2314b583b\") " pod="openshift-insights/insights-runtime-extractor-c2hhr"
Apr 23 16:35:23.013645 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.013625 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a0a3f4ea-23a5-4bdf-b548-d2b2314b583b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c2hhr\" (UID: \"a0a3f4ea-23a5-4bdf-b548-d2b2314b583b\") " pod="openshift-insights/insights-runtime-extractor-c2hhr"
Apr 23 16:35:23.014155 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.014140 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-m9crc"]
Apr 23 16:35:23.014249 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.014241 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-m9crc"
Apr 23 16:35:23.019080 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.019057 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-dqj86\""
Apr 23 16:35:23.019220 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.019057 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 23 16:35:23.027211 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.027192 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4prmh\" (UniqueName: \"kubernetes.io/projected/a0a3f4ea-23a5-4bdf-b548-d2b2314b583b-kube-api-access-4prmh\") pod \"insights-runtime-extractor-c2hhr\" (UID: \"a0a3f4ea-23a5-4bdf-b548-d2b2314b583b\") " pod="openshift-insights/insights-runtime-extractor-c2hhr"
Apr 23 16:35:23.112254 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.112139 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a633b951-4a93-4560-a71d-1bc980e75563-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-m9crc\" (UID: \"a633b951-4a93-4560-a71d-1bc980e75563\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-m9crc"
Apr 23 16:35:23.211691 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.211655 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-c2hhr"
Apr 23 16:35:23.212753 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.212661 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a633b951-4a93-4560-a71d-1bc980e75563-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-m9crc\" (UID: \"a633b951-4a93-4560-a71d-1bc980e75563\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-m9crc"
Apr 23 16:35:23.216280 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.216257 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a633b951-4a93-4560-a71d-1bc980e75563-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-m9crc\" (UID: \"a633b951-4a93-4560-a71d-1bc980e75563\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-m9crc"
Apr 23 16:35:23.329254 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.329225 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-m9crc"
Apr 23 16:35:23.338849 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.338820 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-c2hhr"]
Apr 23 16:35:23.343990 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:35:23.343964 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0a3f4ea_23a5_4bdf_b548_d2b2314b583b.slice/crio-09fd0ebaf1dc39f2ddf6bccb4b4ae86195f9b450fd3c359ade3dfaa75d9e6158 WatchSource:0}: Error finding container 09fd0ebaf1dc39f2ddf6bccb4b4ae86195f9b450fd3c359ade3dfaa75d9e6158: Status 404 returned error can't find the container with id 09fd0ebaf1dc39f2ddf6bccb4b4ae86195f9b450fd3c359ade3dfaa75d9e6158
Apr 23 16:35:23.454717 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.454689 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-m9crc"]
Apr 23 16:35:23.457473 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:35:23.457444 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda633b951_4a93_4560_a71d_1bc980e75563.slice/crio-4244e3e23218ec1706d287504f7657490553343abf7412f73f7be83a5b97d2c8 WatchSource:0}: Error finding container 4244e3e23218ec1706d287504f7657490553343abf7412f73f7be83a5b97d2c8: Status 404 returned error can't find the container with id 4244e3e23218ec1706d287504f7657490553343abf7412f73f7be83a5b97d2c8
Apr 23 16:35:23.490982 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.490955 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-m9crc" event={"ID":"a633b951-4a93-4560-a71d-1bc980e75563","Type":"ContainerStarted","Data":"4244e3e23218ec1706d287504f7657490553343abf7412f73f7be83a5b97d2c8"}
Apr 23 16:35:23.492644 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.492620 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c2hhr" event={"ID":"a0a3f4ea-23a5-4bdf-b548-d2b2314b583b","Type":"ContainerStarted","Data":"e276bd3e617546a3b9d837da033727c58e6ef6423f6567ede0ffc155bfc2d8fd"}
Apr 23 16:35:23.492731 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:23.492649 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c2hhr" event={"ID":"a0a3f4ea-23a5-4bdf-b548-d2b2314b583b","Type":"ContainerStarted","Data":"09fd0ebaf1dc39f2ddf6bccb4b4ae86195f9b450fd3c359ade3dfaa75d9e6158"}
Apr 23 16:35:25.501381 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:25.501345 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c2hhr" event={"ID":"a0a3f4ea-23a5-4bdf-b548-d2b2314b583b","Type":"ContainerStarted","Data":"95791eb4c2f81333ae250b452900acdd163faaf3895ffa32be946515ed757941"}
Apr 23 16:35:26.505831 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.505752 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-m9crc" event={"ID":"a633b951-4a93-4560-a71d-1bc980e75563","Type":"ContainerStarted","Data":"023a166c76415c9f46e66066d9d3599ada9d1d3d728190b50231bd3e12e4725c"}
Apr 23 16:35:26.506217 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.505900 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-m9crc"
Apr 23 16:35:26.520524 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.520488 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-m9crc"
Apr 23 16:35:26.524294 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.524238 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-m9crc" podStartSLOduration=1.800377192 podStartE2EDuration="4.524222778s" podCreationTimestamp="2026-04-23 16:35:22 +0000 UTC" firstStartedPulling="2026-04-23 16:35:23.459400798 +0000 UTC m=+77.031913697" lastFinishedPulling="2026-04-23 16:35:26.183246372 +0000 UTC m=+79.755759283" observedRunningTime="2026-04-23 16:35:26.52373221 +0000 UTC m=+80.096245122" watchObservedRunningTime="2026-04-23 16:35:26.524222778 +0000 UTC m=+80.096735697"
Apr 23 16:35:26.700195 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.700163 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-fz96r"]
Apr 23 16:35:26.703124 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.703107 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-fz96r"
Apr 23 16:35:26.706223 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.706201 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 23 16:35:26.706321 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.706201 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-fp2m4\""
Apr 23 16:35:26.706415 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.706396 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 23 16:35:26.706471 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.706436 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 23 16:35:26.711897 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.711873 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-fz96r"]
Apr 23 16:35:26.741872 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.741829 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb4a1942-d3dc-4d66-8952-3ba99aec8a05-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-fz96r\" (UID: \"fb4a1942-d3dc-4d66-8952-3ba99aec8a05\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fz96r"
Apr 23 16:35:26.742059 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.741882 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84fbs\" (UniqueName: \"kubernetes.io/projected/fb4a1942-d3dc-4d66-8952-3ba99aec8a05-kube-api-access-84fbs\") pod \"prometheus-operator-5676c8c784-fz96r\" (UID: \"fb4a1942-d3dc-4d66-8952-3ba99aec8a05\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fz96r"
Apr 23 16:35:26.742059 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.741967 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb4a1942-d3dc-4d66-8952-3ba99aec8a05-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-fz96r\" (UID: \"fb4a1942-d3dc-4d66-8952-3ba99aec8a05\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fz96r"
Apr 23 16:35:26.742059 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.742028 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fb4a1942-d3dc-4d66-8952-3ba99aec8a05-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-fz96r\" (UID: \"fb4a1942-d3dc-4d66-8952-3ba99aec8a05\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fz96r"
Apr 23 16:35:26.843412 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.843335 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fb4a1942-d3dc-4d66-8952-3ba99aec8a05-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-fz96r\" (UID: \"fb4a1942-d3dc-4d66-8952-3ba99aec8a05\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fz96r"
Apr 23 16:35:26.843412 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.843400 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb4a1942-d3dc-4d66-8952-3ba99aec8a05-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-fz96r\" (UID: \"fb4a1942-d3dc-4d66-8952-3ba99aec8a05\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fz96r"
Apr 23 16:35:26.843562 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.843421 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84fbs\" (UniqueName: \"kubernetes.io/projected/fb4a1942-d3dc-4d66-8952-3ba99aec8a05-kube-api-access-84fbs\") pod \"prometheus-operator-5676c8c784-fz96r\" (UID: \"fb4a1942-d3dc-4d66-8952-3ba99aec8a05\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fz96r"
Apr 23 16:35:26.843562 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.843447 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb4a1942-d3dc-4d66-8952-3ba99aec8a05-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-fz96r\" (UID: \"fb4a1942-d3dc-4d66-8952-3ba99aec8a05\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fz96r"
Apr 23 16:35:26.844160 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.844123 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb4a1942-d3dc-4d66-8952-3ba99aec8a05-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-fz96r\" (UID: \"fb4a1942-d3dc-4d66-8952-3ba99aec8a05\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fz96r"
Apr 23 16:35:26.845899 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.845882 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb4a1942-d3dc-4d66-8952-3ba99aec8a05-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-fz96r\" (UID: \"fb4a1942-d3dc-4d66-8952-3ba99aec8a05\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fz96r"
Apr 23 16:35:26.845997 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.845978 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fb4a1942-d3dc-4d66-8952-3ba99aec8a05-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-fz96r\" (UID: \"fb4a1942-d3dc-4d66-8952-3ba99aec8a05\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fz96r"
Apr 23 16:35:26.852437 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:26.852415 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84fbs\" (UniqueName: \"kubernetes.io/projected/fb4a1942-d3dc-4d66-8952-3ba99aec8a05-kube-api-access-84fbs\") pod \"prometheus-operator-5676c8c784-fz96r\" (UID: \"fb4a1942-d3dc-4d66-8952-3ba99aec8a05\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fz96r"
Apr 23 16:35:27.013152 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:27.013124 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-fz96r"
Apr 23 16:35:27.143210 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:27.143078 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-fz96r"]
Apr 23 16:35:27.146716 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:35:27.146689 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb4a1942_d3dc_4d66_8952_3ba99aec8a05.slice/crio-7ad65e0a33f43e6ba14b04e8f905fe4bfb48ca19d7be97037f94299fddf7d13d WatchSource:0}: Error finding container 7ad65e0a33f43e6ba14b04e8f905fe4bfb48ca19d7be97037f94299fddf7d13d: Status 404 returned error can't find the container with id 7ad65e0a33f43e6ba14b04e8f905fe4bfb48ca19d7be97037f94299fddf7d13d
Apr 23 16:35:27.510083 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:27.510046 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-fz96r" event={"ID":"fb4a1942-d3dc-4d66-8952-3ba99aec8a05","Type":"ContainerStarted","Data":"7ad65e0a33f43e6ba14b04e8f905fe4bfb48ca19d7be97037f94299fddf7d13d"}
Apr 23 16:35:27.512009 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:27.511980 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c2hhr" event={"ID":"a0a3f4ea-23a5-4bdf-b548-d2b2314b583b","Type":"ContainerStarted","Data":"fc667e5ea92de83d0132c517538b45ac290c532e0d79e936091882cf16c99420"}
Apr 23 16:35:27.533972 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:27.533907 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-c2hhr" podStartSLOduration=2.444224036 podStartE2EDuration="5.533892616s" podCreationTimestamp="2026-04-23 16:35:22 +0000 UTC" firstStartedPulling="2026-04-23 16:35:23.491125738 +0000 UTC m=+77.063638642" lastFinishedPulling="2026-04-23 16:35:26.580794311 +0000 UTC m=+80.153307222" observedRunningTime="2026-04-23 16:35:27.531856111 +0000 UTC m=+81.104369032" watchObservedRunningTime="2026-04-23 16:35:27.533892616 +0000 UTC m=+81.106405552"
Apr 23 16:35:28.377836 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:28.377808 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-d7t2c"
Apr 23 16:35:28.477581 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:28.477555 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-g9pgx"
Apr 23 16:35:29.520149 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:29.520109 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-fz96r" event={"ID":"fb4a1942-d3dc-4d66-8952-3ba99aec8a05","Type":"ContainerStarted","Data":"7c4f45dec30d5eda909d371f7a0a7989eed4dc1d08831babf5a0a1b2b5fb4658"}
Apr 23 16:35:29.520149 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:29.520155 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-fz96r" event={"ID":"fb4a1942-d3dc-4d66-8952-3ba99aec8a05","Type":"ContainerStarted","Data":"9a0d885d74d137cddb0f1f8611029447df420745d37a6b8d8c7df2511615f1b2"}
Apr 23 16:35:29.543047 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:29.542988 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-fz96r" podStartSLOduration=1.8563207080000002 podStartE2EDuration="3.542974783s" podCreationTimestamp="2026-04-23 16:35:26 +0000 UTC" firstStartedPulling="2026-04-23 16:35:27.148567095 +0000 UTC m=+80.721079995" lastFinishedPulling="2026-04-23 16:35:28.835221048 +0000 UTC m=+82.407734070" observedRunningTime="2026-04-23 16:35:29.542688199 +0000 UTC m=+83.115201119" watchObservedRunningTime="2026-04-23 16:35:29.542974783 +0000 UTC m=+83.115487704"
Apr 23 16:35:31.145838 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.145797 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-bzw7l"]
Apr 23 16:35:31.150038 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.150019 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.153153 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.153126 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 23 16:35:31.153691 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.153671 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 23 16:35:31.154597 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.154575 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-g4jgc\""
Apr 23 16:35:31.154702 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.154580 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 23 16:35:31.181471 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.181438 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.181605 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.181485 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-node-exporter-wtmp\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.181605 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.181531 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-node-exporter-accelerators-collector-config\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.181711 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.181614 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-metrics-client-ca\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.181711 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.181662 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-sys\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.181711 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.181695 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-node-exporter-textfile\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.181860 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.181722 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-node-exporter-tls\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.181860 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.181757 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-root\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.181860 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.181780 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75njw\" (UniqueName: \"kubernetes.io/projected/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-kube-api-access-75njw\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.283193 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.283148 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-root\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.283193 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.283194 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75njw\" (UniqueName: \"kubernetes.io/projected/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-kube-api-access-75njw\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.283436 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.283262 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.283436 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.283293 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-node-exporter-wtmp\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.283436 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.283262 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-root\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.283436 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.283341 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-node-exporter-accelerators-collector-config\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.283436 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.283398 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-metrics-client-ca\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.283693 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.283443 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-sys\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.283693 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.283476 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-node-exporter-textfile\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.283693 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.283501 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-node-exporter-wtmp\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.283693 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.283503 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-node-exporter-tls\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.283693 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:35:31.283589 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 23 16:35:31.283693 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.283602 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-sys\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.283693 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:35:31.283644 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-node-exporter-tls podName:e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:31.783624216 +0000 UTC m=+85.356137121 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-node-exporter-tls") pod "node-exporter-bzw7l" (UID: "e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14") : secret "node-exporter-tls" not found
Apr 23 16:35:31.284065 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.283964 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-node-exporter-textfile\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.284310 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.284288 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-node-exporter-accelerators-collector-config\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.284405 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.284348 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-metrics-client-ca\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.286427 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.286406 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.292394 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.292344 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75njw\" (UniqueName: \"kubernetes.io/projected/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-kube-api-access-75njw\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.787643 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.787612 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-node-exporter-tls\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.789909 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.789889 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14-node-exporter-tls\") pod \"node-exporter-bzw7l\" (UID: \"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14\") " pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:31.918344 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.918308 2572 patch_prober.go:28] interesting pod/image-registry-5d445c8494-5dps7 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 16:35:31.918490 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:31.918357 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5d445c8494-5dps7" podUID="a1ccc661-cbd0-49b7-b172-5749b2c3e73f" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:35:32.061733 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:32.061662 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bzw7l"
Apr 23 16:35:32.072616 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:35:32.072588 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0fe7dc7_00ed_498f_ac8f_8d5b49bf5f14.slice/crio-7f48ca6f67b1de7ea25794ef67ad85d98ea582fd0d8e9b8935b8a7eeff598a88 WatchSource:0}: Error finding container 7f48ca6f67b1de7ea25794ef67ad85d98ea582fd0d8e9b8935b8a7eeff598a88: Status 404 returned error can't find the container with id 7f48ca6f67b1de7ea25794ef67ad85d98ea582fd0d8e9b8935b8a7eeff598a88
Apr 23 16:35:32.532422 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:32.532341 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bzw7l" event={"ID":"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14","Type":"ContainerStarted","Data":"7f48ca6f67b1de7ea25794ef67ad85d98ea582fd0d8e9b8935b8a7eeff598a88"}
Apr 23 16:35:32.893834 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:32.893804 2572 patch_prober.go:28] interesting pod/image-registry-57798c5bd5-257tb container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 16:35:32.893954 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:32.893851 2572 prober.go:120] "Probe failed"
probeType="Readiness" pod="openshift-image-registry/image-registry-57798c5bd5-257tb" podUID="5460e905-1b76-4876-8552-7f9866fe1dc0" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:35:33.026957 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:33.026910 2572 scope.go:117] "RemoveContainer" containerID="a22aaf7ac455b388f4e9526067fafbda4a3623627b8864c80ffaefe8aab94256" Apr 23 16:35:33.452645 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:33.452615 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5d445c8494-5dps7" Apr 23 16:35:33.537166 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:33.537133 2572 generic.go:358] "Generic (PLEG): container finished" podID="e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14" containerID="fbcb717ef05d6e13875265b8ca32b8aac87d6723ec00eb3c8a74fde9c006f2f9" exitCode=0 Apr 23 16:35:33.537554 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:33.537224 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bzw7l" event={"ID":"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14","Type":"ContainerDied","Data":"fbcb717ef05d6e13875265b8ca32b8aac87d6723ec00eb3c8a74fde9c006f2f9"} Apr 23 16:35:33.539047 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:33.539029 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bwh5t_d2744f4f-889f-4833-82ac-129e28488162/console-operator/2.log" Apr 23 16:35:33.539144 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:33.539133 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t" event={"ID":"d2744f4f-889f-4833-82ac-129e28488162","Type":"ContainerStarted","Data":"0e8dc2a0cbfa26d037b3b87e9b3efab414674f7435bd67afcc61dd6289e19158"} Apr 23 16:35:33.539411 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:33.539392 2572 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t" Apr 23 16:35:33.580635 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:33.580594 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t" podStartSLOduration=71.936185665 podStartE2EDuration="1m26.580581791s" podCreationTimestamp="2026-04-23 16:34:07 +0000 UTC" firstStartedPulling="2026-04-23 16:34:40.429598777 +0000 UTC m=+34.002111687" lastFinishedPulling="2026-04-23 16:34:55.073994912 +0000 UTC m=+48.646507813" observedRunningTime="2026-04-23 16:35:33.579534565 +0000 UTC m=+87.152047488" watchObservedRunningTime="2026-04-23 16:35:33.580581791 +0000 UTC m=+87.153094712" Apr 23 16:35:34.239546 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:34.239514 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-bwh5t" Apr 23 16:35:34.544320 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:34.544228 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bzw7l" event={"ID":"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14","Type":"ContainerStarted","Data":"842856d1672a420ea2ae6f9fb94e19380e037f6c762ed1d51faab47608ce7253"} Apr 23 16:35:34.544320 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:34.544276 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bzw7l" event={"ID":"e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14","Type":"ContainerStarted","Data":"009d7c270685cdda2831d53d242ec31b66066668a12ef21122b3cfa9d9c67c57"} Apr 23 16:35:34.570060 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:34.570010 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-bzw7l" podStartSLOduration=2.872034882 podStartE2EDuration="3.569997612s" podCreationTimestamp="2026-04-23 16:35:31 +0000 UTC" 
firstStartedPulling="2026-04-23 16:35:32.074682704 +0000 UTC m=+85.647195604" lastFinishedPulling="2026-04-23 16:35:32.77264543 +0000 UTC m=+86.345158334" observedRunningTime="2026-04-23 16:35:34.568726696 +0000 UTC m=+88.141239617" watchObservedRunningTime="2026-04-23 16:35:34.569997612 +0000 UTC m=+88.142510530" Apr 23 16:35:42.892801 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:42.892772 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-57798c5bd5-257tb" Apr 23 16:35:47.907848 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:47.907778 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-57798c5bd5-257tb" podUID="5460e905-1b76-4876-8552-7f9866fe1dc0" containerName="registry" containerID="cri-o://ffe37736cf3d884e8a4cd021ad6ac210401d6df916ec2625779f47e0033a1d81" gracePeriod=30 Apr 23 16:35:48.149663 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.149637 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-57798c5bd5-257tb" Apr 23 16:35:48.230873 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.230802 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5460e905-1b76-4876-8552-7f9866fe1dc0-ca-trust-extracted\") pod \"5460e905-1b76-4876-8552-7f9866fe1dc0\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " Apr 23 16:35:48.231042 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.230885 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5460e905-1b76-4876-8552-7f9866fe1dc0-image-registry-private-configuration\") pod \"5460e905-1b76-4876-8552-7f9866fe1dc0\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " Apr 23 16:35:48.231042 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.230903 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-bound-sa-token\") pod \"5460e905-1b76-4876-8552-7f9866fe1dc0\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " Apr 23 16:35:48.231042 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.230944 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-certificates\") pod \"5460e905-1b76-4876-8552-7f9866fe1dc0\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " Apr 23 16:35:48.231042 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.230966 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls\") pod \"5460e905-1b76-4876-8552-7f9866fe1dc0\" (UID: 
\"5460e905-1b76-4876-8552-7f9866fe1dc0\") " Apr 23 16:35:48.231042 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.230994 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5460e905-1b76-4876-8552-7f9866fe1dc0-trusted-ca\") pod \"5460e905-1b76-4876-8552-7f9866fe1dc0\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " Apr 23 16:35:48.231042 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.231017 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5460e905-1b76-4876-8552-7f9866fe1dc0-installation-pull-secrets\") pod \"5460e905-1b76-4876-8552-7f9866fe1dc0\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " Apr 23 16:35:48.231324 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.231051 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smh2t\" (UniqueName: \"kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-kube-api-access-smh2t\") pod \"5460e905-1b76-4876-8552-7f9866fe1dc0\" (UID: \"5460e905-1b76-4876-8552-7f9866fe1dc0\") " Apr 23 16:35:48.231615 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.231449 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5460e905-1b76-4876-8552-7f9866fe1dc0" (UID: "5460e905-1b76-4876-8552-7f9866fe1dc0"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:35:48.231615 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.231489 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5460e905-1b76-4876-8552-7f9866fe1dc0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5460e905-1b76-4876-8552-7f9866fe1dc0" (UID: "5460e905-1b76-4876-8552-7f9866fe1dc0"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:35:48.233662 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.233631 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5460e905-1b76-4876-8552-7f9866fe1dc0" (UID: "5460e905-1b76-4876-8552-7f9866fe1dc0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:35:48.233803 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.233780 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5460e905-1b76-4876-8552-7f9866fe1dc0-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "5460e905-1b76-4876-8552-7f9866fe1dc0" (UID: "5460e905-1b76-4876-8552-7f9866fe1dc0"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:35:48.233873 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.233836 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-kube-api-access-smh2t" (OuterVolumeSpecName: "kube-api-access-smh2t") pod "5460e905-1b76-4876-8552-7f9866fe1dc0" (UID: "5460e905-1b76-4876-8552-7f9866fe1dc0"). InnerVolumeSpecName "kube-api-access-smh2t". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:35:48.233954 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.233890 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5460e905-1b76-4876-8552-7f9866fe1dc0" (UID: "5460e905-1b76-4876-8552-7f9866fe1dc0"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:35:48.234106 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.234084 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5460e905-1b76-4876-8552-7f9866fe1dc0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5460e905-1b76-4876-8552-7f9866fe1dc0" (UID: "5460e905-1b76-4876-8552-7f9866fe1dc0"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:35:48.239887 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.239863 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5460e905-1b76-4876-8552-7f9866fe1dc0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5460e905-1b76-4876-8552-7f9866fe1dc0" (UID: "5460e905-1b76-4876-8552-7f9866fe1dc0"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:35:48.332463 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.332438 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-certificates\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 23 16:35:48.332463 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.332462 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-registry-tls\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 23 16:35:48.332597 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.332471 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5460e905-1b76-4876-8552-7f9866fe1dc0-trusted-ca\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 23 16:35:48.332597 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.332481 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5460e905-1b76-4876-8552-7f9866fe1dc0-installation-pull-secrets\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 23 16:35:48.332597 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.332490 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-smh2t\" (UniqueName: \"kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-kube-api-access-smh2t\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 23 16:35:48.332597 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.332499 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5460e905-1b76-4876-8552-7f9866fe1dc0-ca-trust-extracted\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 23 16:35:48.332597 
ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.332509 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5460e905-1b76-4876-8552-7f9866fe1dc0-image-registry-private-configuration\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 23 16:35:48.332597 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.332518 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5460e905-1b76-4876-8552-7f9866fe1dc0-bound-sa-token\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 23 16:35:48.585203 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.585122 2572 generic.go:358] "Generic (PLEG): container finished" podID="5460e905-1b76-4876-8552-7f9866fe1dc0" containerID="ffe37736cf3d884e8a4cd021ad6ac210401d6df916ec2625779f47e0033a1d81" exitCode=0 Apr 23 16:35:48.585203 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.585166 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57798c5bd5-257tb" event={"ID":"5460e905-1b76-4876-8552-7f9866fe1dc0","Type":"ContainerDied","Data":"ffe37736cf3d884e8a4cd021ad6ac210401d6df916ec2625779f47e0033a1d81"} Apr 23 16:35:48.585203 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.585187 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57798c5bd5-257tb" event={"ID":"5460e905-1b76-4876-8552-7f9866fe1dc0","Type":"ContainerDied","Data":"976c9181c969263244013b27231e1b3fcd47b646bbd369940c8eee21d9f7539b"} Apr 23 16:35:48.585203 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.585202 2572 scope.go:117] "RemoveContainer" containerID="ffe37736cf3d884e8a4cd021ad6ac210401d6df916ec2625779f47e0033a1d81" Apr 23 16:35:48.585473 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.585209 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-57798c5bd5-257tb" Apr 23 16:35:48.593804 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.593788 2572 scope.go:117] "RemoveContainer" containerID="ffe37736cf3d884e8a4cd021ad6ac210401d6df916ec2625779f47e0033a1d81" Apr 23 16:35:48.594093 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:35:48.594073 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffe37736cf3d884e8a4cd021ad6ac210401d6df916ec2625779f47e0033a1d81\": container with ID starting with ffe37736cf3d884e8a4cd021ad6ac210401d6df916ec2625779f47e0033a1d81 not found: ID does not exist" containerID="ffe37736cf3d884e8a4cd021ad6ac210401d6df916ec2625779f47e0033a1d81" Apr 23 16:35:48.594150 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.594102 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffe37736cf3d884e8a4cd021ad6ac210401d6df916ec2625779f47e0033a1d81"} err="failed to get container status \"ffe37736cf3d884e8a4cd021ad6ac210401d6df916ec2625779f47e0033a1d81\": rpc error: code = NotFound desc = could not find container \"ffe37736cf3d884e8a4cd021ad6ac210401d6df916ec2625779f47e0033a1d81\": container with ID starting with ffe37736cf3d884e8a4cd021ad6ac210401d6df916ec2625779f47e0033a1d81 not found: ID does not exist" Apr 23 16:35:48.610038 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.610016 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-57798c5bd5-257tb"] Apr 23 16:35:48.614767 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:48.614747 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-57798c5bd5-257tb"] Apr 23 16:35:49.030155 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:49.030111 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5460e905-1b76-4876-8552-7f9866fe1dc0" 
path="/var/lib/kubelet/pods/5460e905-1b76-4876-8552-7f9866fe1dc0/volumes" Apr 23 16:35:50.105743 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:35:50.105712 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5d445c8494-5dps7"] Apr 23 16:36:15.124520 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.124462 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5d445c8494-5dps7" podUID="a1ccc661-cbd0-49b7-b172-5749b2c3e73f" containerName="registry" containerID="cri-o://42fd752046f9e980d03d4c27409316a75351d61dd7c0862958ef015d897b4689" gracePeriod=30 Apr 23 16:36:15.362694 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.362672 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d445c8494-5dps7" Apr 23 16:36:15.453999 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.453969 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-image-registry-private-configuration\") pod \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " Apr 23 16:36:15.454146 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.454012 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-ca-trust-extracted\") pod \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " Apr 23 16:36:15.454146 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.454051 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc2b7\" (UniqueName: \"kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-kube-api-access-fc2b7\") pod 
\"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " Apr 23 16:36:15.454146 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.454087 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-certificates\") pod \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " Apr 23 16:36:15.454146 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.454110 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls\") pod \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " Apr 23 16:36:15.454350 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.454152 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-bound-sa-token\") pod \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " Apr 23 16:36:15.454350 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.454183 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-installation-pull-secrets\") pod \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " Apr 23 16:36:15.454350 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.454259 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-trusted-ca\") pod \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\" (UID: \"a1ccc661-cbd0-49b7-b172-5749b2c3e73f\") " Apr 23 16:36:15.454560 
ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.454528 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a1ccc661-cbd0-49b7-b172-5749b2c3e73f" (UID: "a1ccc661-cbd0-49b7-b172-5749b2c3e73f"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:36:15.454819 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.454781 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a1ccc661-cbd0-49b7-b172-5749b2c3e73f" (UID: "a1ccc661-cbd0-49b7-b172-5749b2c3e73f"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:36:15.456931 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.456887 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a1ccc661-cbd0-49b7-b172-5749b2c3e73f" (UID: "a1ccc661-cbd0-49b7-b172-5749b2c3e73f"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:36:15.456931 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.456890 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "a1ccc661-cbd0-49b7-b172-5749b2c3e73f" (UID: "a1ccc661-cbd0-49b7-b172-5749b2c3e73f"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:36:15.457096 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.456941 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a1ccc661-cbd0-49b7-b172-5749b2c3e73f" (UID: "a1ccc661-cbd0-49b7-b172-5749b2c3e73f"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:36:15.457096 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.457026 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a1ccc661-cbd0-49b7-b172-5749b2c3e73f" (UID: "a1ccc661-cbd0-49b7-b172-5749b2c3e73f"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:36:15.457096 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.457070 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-kube-api-access-fc2b7" (OuterVolumeSpecName: "kube-api-access-fc2b7") pod "a1ccc661-cbd0-49b7-b172-5749b2c3e73f" (UID: "a1ccc661-cbd0-49b7-b172-5749b2c3e73f"). InnerVolumeSpecName "kube-api-access-fc2b7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:36:15.462779 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.462758 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a1ccc661-cbd0-49b7-b172-5749b2c3e73f" (UID: "a1ccc661-cbd0-49b7-b172-5749b2c3e73f"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:36:15.555350 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.555316 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-bound-sa-token\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 23 16:36:15.555350 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.555346 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-installation-pull-secrets\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 23 16:36:15.555350 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.555356 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-trusted-ca\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 23 16:36:15.555564 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.555368 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-image-registry-private-configuration\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 23 16:36:15.555564 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.555377 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-ca-trust-extracted\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 23 16:36:15.555564 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.555386 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fc2b7\" (UniqueName: \"kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-kube-api-access-fc2b7\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath 
\"\"" Apr 23 16:36:15.555564 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.555397 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-certificates\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 23 16:36:15.555564 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.555405 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a1ccc661-cbd0-49b7-b172-5749b2c3e73f-registry-tls\") on node \"ip-10-0-133-231.ec2.internal\" DevicePath \"\"" Apr 23 16:36:15.665030 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.664980 2572 generic.go:358] "Generic (PLEG): container finished" podID="a1ccc661-cbd0-49b7-b172-5749b2c3e73f" containerID="42fd752046f9e980d03d4c27409316a75351d61dd7c0862958ef015d897b4689" exitCode=0 Apr 23 16:36:15.665207 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.665057 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5d445c8494-5dps7" Apr 23 16:36:15.665207 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.665070 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d445c8494-5dps7" event={"ID":"a1ccc661-cbd0-49b7-b172-5749b2c3e73f","Type":"ContainerDied","Data":"42fd752046f9e980d03d4c27409316a75351d61dd7c0862958ef015d897b4689"} Apr 23 16:36:15.665207 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.665112 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d445c8494-5dps7" event={"ID":"a1ccc661-cbd0-49b7-b172-5749b2c3e73f","Type":"ContainerDied","Data":"92982dca6cfa6b75c69304d1a76c75d22fd64be8b06ce3723c4e5c520fca08f6"} Apr 23 16:36:15.665207 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.665135 2572 scope.go:117] "RemoveContainer" containerID="42fd752046f9e980d03d4c27409316a75351d61dd7c0862958ef015d897b4689" Apr 23 16:36:15.673700 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.673683 2572 scope.go:117] "RemoveContainer" containerID="42fd752046f9e980d03d4c27409316a75351d61dd7c0862958ef015d897b4689" Apr 23 16:36:15.673982 ip-10-0-133-231 kubenswrapper[2572]: E0423 16:36:15.673962 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42fd752046f9e980d03d4c27409316a75351d61dd7c0862958ef015d897b4689\": container with ID starting with 42fd752046f9e980d03d4c27409316a75351d61dd7c0862958ef015d897b4689 not found: ID does not exist" containerID="42fd752046f9e980d03d4c27409316a75351d61dd7c0862958ef015d897b4689" Apr 23 16:36:15.674061 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.673989 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42fd752046f9e980d03d4c27409316a75351d61dd7c0862958ef015d897b4689"} err="failed to get container status 
\"42fd752046f9e980d03d4c27409316a75351d61dd7c0862958ef015d897b4689\": rpc error: code = NotFound desc = could not find container \"42fd752046f9e980d03d4c27409316a75351d61dd7c0862958ef015d897b4689\": container with ID starting with 42fd752046f9e980d03d4c27409316a75351d61dd7c0862958ef015d897b4689 not found: ID does not exist" Apr 23 16:36:15.687420 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.687393 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5d445c8494-5dps7"] Apr 23 16:36:15.691957 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:15.691936 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5d445c8494-5dps7"] Apr 23 16:36:17.030885 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:17.030852 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1ccc661-cbd0-49b7-b172-5749b2c3e73f" path="/var/lib/kubelet/pods/a1ccc661-cbd0-49b7-b172-5749b2c3e73f/volumes" Apr 23 16:36:26.699428 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:26.699391 2572 generic.go:358] "Generic (PLEG): container finished" podID="dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799" containerID="dc5ceee50bcd650226920117b0bf19414c77e9877afb928aaf9d3cba6de98523" exitCode=0 Apr 23 16:36:26.699876 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:26.699461 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xt7q" event={"ID":"dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799","Type":"ContainerDied","Data":"dc5ceee50bcd650226920117b0bf19414c77e9877afb928aaf9d3cba6de98523"} Apr 23 16:36:26.699876 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:26.699856 2572 scope.go:117] "RemoveContainer" containerID="dc5ceee50bcd650226920117b0bf19414c77e9877afb928aaf9d3cba6de98523" Apr 23 16:36:26.700836 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:26.700817 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="970788cb-b97f-467f-bd8e-69787c8efef5" containerID="4c082f7cc93d0bb321216db2ad7d48bd5d76524e1c8bce9d88b1ea69d325cfcc" exitCode=0 Apr 23 16:36:26.700905 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:26.700887 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wg4fk" event={"ID":"970788cb-b97f-467f-bd8e-69787c8efef5","Type":"ContainerDied","Data":"4c082f7cc93d0bb321216db2ad7d48bd5d76524e1c8bce9d88b1ea69d325cfcc"} Apr 23 16:36:26.701164 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:26.701151 2572 scope.go:117] "RemoveContainer" containerID="4c082f7cc93d0bb321216db2ad7d48bd5d76524e1c8bce9d88b1ea69d325cfcc" Apr 23 16:36:27.705341 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:27.705305 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5xt7q" event={"ID":"dfdfff4d-8c20-4a8b-b4e9-4c09e8f17799","Type":"ContainerStarted","Data":"15ee8a4f6398a4cc70d3634d4905a9aaab392d4f4529d84510eb95876516c436"} Apr 23 16:36:27.707341 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:27.707306 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wg4fk" event={"ID":"970788cb-b97f-467f-bd8e-69787c8efef5","Type":"ContainerStarted","Data":"5a5747fcb45a38fc6241237af5e66042c49981e6cee13f43f6632820d10164f0"} Apr 23 16:36:27.709080 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:27.709052 2572 generic.go:358] "Generic (PLEG): container finished" podID="af33177a-4a99-4b44-8427-ff5e05da026f" containerID="69efd057816db4b47e3c09191a885973652d7595c30c2b88c3b3ff82c88ed343" exitCode=0 Apr 23 16:36:27.709192 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:27.709121 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-lfp22" 
event={"ID":"af33177a-4a99-4b44-8427-ff5e05da026f","Type":"ContainerDied","Data":"69efd057816db4b47e3c09191a885973652d7595c30c2b88c3b3ff82c88ed343"} Apr 23 16:36:27.709462 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:27.709447 2572 scope.go:117] "RemoveContainer" containerID="69efd057816db4b47e3c09191a885973652d7595c30c2b88c3b3ff82c88ed343" Apr 23 16:36:28.713435 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:28.713400 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-lfp22" event={"ID":"af33177a-4a99-4b44-8427-ff5e05da026f","Type":"ContainerStarted","Data":"b4b0e5fde8782243a66f1e79a4a482d250ec2a67369d4f243b9d34a638f85421"} Apr 23 16:36:30.083467 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:30.083430 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z" podUID="0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 16:36:40.093275 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:40.093213 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z" podUID="0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 16:36:50.083695 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:50.083655 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z" podUID="0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 16:36:50.084160 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:50.083723 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z" Apr 23 16:36:50.084206 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:50.084168 2572 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"ad9ea5424d52a43407b79375ea9a59e709006fe2443d7c7cab48c9fb2f4ffcce"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 23 16:36:50.084241 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:50.084205 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z" podUID="0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d" containerName="service-proxy" containerID="cri-o://ad9ea5424d52a43407b79375ea9a59e709006fe2443d7c7cab48c9fb2f4ffcce" gracePeriod=30 Apr 23 16:36:50.788020 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:50.787986 2572 generic.go:358] "Generic (PLEG): container finished" podID="0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d" containerID="ad9ea5424d52a43407b79375ea9a59e709006fe2443d7c7cab48c9fb2f4ffcce" exitCode=2 Apr 23 16:36:50.788208 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:50.788063 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z" event={"ID":"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d","Type":"ContainerDied","Data":"ad9ea5424d52a43407b79375ea9a59e709006fe2443d7c7cab48c9fb2f4ffcce"} Apr 23 16:36:50.788208 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:36:50.788099 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bc4dc97b5-ngf7z" event={"ID":"0700ea3b-29ca-4ecf-b4fc-6b870dc6bd5d","Type":"ContainerStarted","Data":"e4b44fc55241fdaefb6a5d48d7b67293317c4b1bdaa3def53228ecb7265c0e71"} Apr 23 
16:39:06.931550 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:39:06.931518 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bwh5t_d2744f4f-889f-4833-82ac-129e28488162/console-operator/2.log" Apr 23 16:39:06.933303 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:39:06.933280 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bwh5t_d2744f4f-889f-4833-82ac-129e28488162/console-operator/2.log" Apr 23 16:39:06.941569 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:39:06.941531 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 16:44:06.954618 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:44:06.954587 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bwh5t_d2744f4f-889f-4833-82ac-129e28488162/console-operator/2.log" Apr 23 16:44:06.955140 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:44:06.954814 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bwh5t_d2744f4f-889f-4833-82ac-129e28488162/console-operator/2.log" Apr 23 16:49:06.977583 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:49:06.977511 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bwh5t_d2744f4f-889f-4833-82ac-129e28488162/console-operator/2.log" Apr 23 16:49:06.978497 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:49:06.978467 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bwh5t_d2744f4f-889f-4833-82ac-129e28488162/console-operator/2.log" Apr 23 16:54:04.794771 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:04.794741 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-p5ms5_9ea279d1-4c47-464b-9c84-1868a227c6b6/global-pull-secret-syncer/0.log" Apr 
23 16:54:04.911566 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:04.911517 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-jn7xj_613078ff-a7b4-43be-9362-ff4e9be86af1/konnectivity-agent/0.log" Apr 23 16:54:04.960682 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:04.960654 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-231.ec2.internal_1eb73e8eaa1b833503296b19a264c17c/haproxy/0.log" Apr 23 16:54:06.998948 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:06.998906 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bwh5t_d2744f4f-889f-4833-82ac-129e28488162/console-operator/2.log" Apr 23 16:54:07.000878 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:07.000854 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bwh5t_d2744f4f-889f-4833-82ac-129e28488162/console-operator/2.log" Apr 23 16:54:08.392347 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:08.392321 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-d4nsk_6e3b3b5e-7dd7-421c-9733-f304050ddbce/cluster-monitoring-operator/0.log" Apr 23 16:54:08.549507 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:08.549425 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bzw7l_e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14/node-exporter/0.log" Apr 23 16:54:08.585907 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:08.585885 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bzw7l_e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14/kube-rbac-proxy/0.log" Apr 23 16:54:08.636628 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:08.636602 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-bzw7l_e0fe7dc7-00ed-498f-ac8f-8d5b49bf5f14/init-textfile/0.log" Apr 23 16:54:09.085211 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:09.085175 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-fz96r_fb4a1942-d3dc-4d66-8952-3ba99aec8a05/prometheus-operator/0.log" Apr 23 16:54:09.107793 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:09.107764 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-fz96r_fb4a1942-d3dc-4d66-8952-3ba99aec8a05/kube-rbac-proxy/0.log" Apr 23 16:54:09.137095 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:09.137073 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-m9crc_a633b951-4a93-4560-a71d-1bc980e75563/prometheus-operator-admission-webhook/0.log" Apr 23 16:54:10.489031 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:10.488971 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-6pbrl_860ae3e8-8bc5-4280-aefd-e5190c5e1db8/networking-console-plugin/0.log" Apr 23 16:54:10.931003 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:10.930954 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bwh5t_d2744f4f-889f-4833-82ac-129e28488162/console-operator/2.log" Apr 23 16:54:10.934941 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:10.934908 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bwh5t_d2744f4f-889f-4833-82ac-129e28488162/console-operator/3.log" Apr 23 16:54:11.690799 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:11.690770 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-nqvgv_cef5ddfd-948f-4294-a2d2-9123e23feea6/volume-data-source-validator/0.log" Apr 23 16:54:11.903696 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:11.903666 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd"] Apr 23 16:54:11.903981 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:11.903968 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5460e905-1b76-4876-8552-7f9866fe1dc0" containerName="registry" Apr 23 16:54:11.904030 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:11.903982 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5460e905-1b76-4876-8552-7f9866fe1dc0" containerName="registry" Apr 23 16:54:11.904030 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:11.903997 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1ccc661-cbd0-49b7-b172-5749b2c3e73f" containerName="registry" Apr 23 16:54:11.904030 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:11.904003 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ccc661-cbd0-49b7-b172-5749b2c3e73f" containerName="registry" Apr 23 16:54:11.904120 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:11.904061 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5460e905-1b76-4876-8552-7f9866fe1dc0" containerName="registry" Apr 23 16:54:11.904120 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:11.904072 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1ccc661-cbd0-49b7-b172-5749b2c3e73f" containerName="registry" Apr 23 16:54:11.907254 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:11.907235 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" Apr 23 16:54:11.909697 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:11.909674 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gwvnc\"/\"openshift-service-ca.crt\"" Apr 23 16:54:11.910733 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:11.910718 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gwvnc\"/\"default-dockercfg-kgzpm\"" Apr 23 16:54:11.910795 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:11.910748 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gwvnc\"/\"kube-root-ca.crt\"" Apr 23 16:54:11.913402 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:11.913379 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd"] Apr 23 16:54:12.023604 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.023523 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4bcr\" (UniqueName: \"kubernetes.io/projected/3002bc6d-a283-41cf-a543-57889b0b9d4d-kube-api-access-w4bcr\") pod \"perf-node-gather-daemonset-dwtnd\" (UID: \"3002bc6d-a283-41cf-a543-57889b0b9d4d\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" Apr 23 16:54:12.023604 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.023567 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3002bc6d-a283-41cf-a543-57889b0b9d4d-sys\") pod \"perf-node-gather-daemonset-dwtnd\" (UID: \"3002bc6d-a283-41cf-a543-57889b0b9d4d\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" Apr 23 16:54:12.023604 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.023584 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3002bc6d-a283-41cf-a543-57889b0b9d4d-proc\") pod \"perf-node-gather-daemonset-dwtnd\" (UID: \"3002bc6d-a283-41cf-a543-57889b0b9d4d\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" Apr 23 16:54:12.023806 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.023653 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3002bc6d-a283-41cf-a543-57889b0b9d4d-lib-modules\") pod \"perf-node-gather-daemonset-dwtnd\" (UID: \"3002bc6d-a283-41cf-a543-57889b0b9d4d\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" Apr 23 16:54:12.023806 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.023719 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3002bc6d-a283-41cf-a543-57889b0b9d4d-podres\") pod \"perf-node-gather-daemonset-dwtnd\" (UID: \"3002bc6d-a283-41cf-a543-57889b0b9d4d\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" Apr 23 16:54:12.124491 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.124459 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3002bc6d-a283-41cf-a543-57889b0b9d4d-podres\") pod \"perf-node-gather-daemonset-dwtnd\" (UID: \"3002bc6d-a283-41cf-a543-57889b0b9d4d\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" Apr 23 16:54:12.124638 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.124515 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4bcr\" (UniqueName: \"kubernetes.io/projected/3002bc6d-a283-41cf-a543-57889b0b9d4d-kube-api-access-w4bcr\") pod \"perf-node-gather-daemonset-dwtnd\" (UID: 
\"3002bc6d-a283-41cf-a543-57889b0b9d4d\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" Apr 23 16:54:12.124638 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.124548 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3002bc6d-a283-41cf-a543-57889b0b9d4d-sys\") pod \"perf-node-gather-daemonset-dwtnd\" (UID: \"3002bc6d-a283-41cf-a543-57889b0b9d4d\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" Apr 23 16:54:12.124638 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.124563 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3002bc6d-a283-41cf-a543-57889b0b9d4d-proc\") pod \"perf-node-gather-daemonset-dwtnd\" (UID: \"3002bc6d-a283-41cf-a543-57889b0b9d4d\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" Apr 23 16:54:12.124638 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.124581 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3002bc6d-a283-41cf-a543-57889b0b9d4d-lib-modules\") pod \"perf-node-gather-daemonset-dwtnd\" (UID: \"3002bc6d-a283-41cf-a543-57889b0b9d4d\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" Apr 23 16:54:12.124831 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.124646 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3002bc6d-a283-41cf-a543-57889b0b9d4d-sys\") pod \"perf-node-gather-daemonset-dwtnd\" (UID: \"3002bc6d-a283-41cf-a543-57889b0b9d4d\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" Apr 23 16:54:12.124831 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.124672 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/3002bc6d-a283-41cf-a543-57889b0b9d4d-proc\") pod \"perf-node-gather-daemonset-dwtnd\" (UID: \"3002bc6d-a283-41cf-a543-57889b0b9d4d\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" Apr 23 16:54:12.124831 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.124687 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3002bc6d-a283-41cf-a543-57889b0b9d4d-lib-modules\") pod \"perf-node-gather-daemonset-dwtnd\" (UID: \"3002bc6d-a283-41cf-a543-57889b0b9d4d\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" Apr 23 16:54:12.124831 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.124646 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3002bc6d-a283-41cf-a543-57889b0b9d4d-podres\") pod \"perf-node-gather-daemonset-dwtnd\" (UID: \"3002bc6d-a283-41cf-a543-57889b0b9d4d\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" Apr 23 16:54:12.132143 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.132119 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4bcr\" (UniqueName: \"kubernetes.io/projected/3002bc6d-a283-41cf-a543-57889b0b9d4d-kube-api-access-w4bcr\") pod \"perf-node-gather-daemonset-dwtnd\" (UID: \"3002bc6d-a283-41cf-a543-57889b0b9d4d\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" Apr 23 16:54:12.218531 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.218505 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" Apr 23 16:54:12.338828 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.338791 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd"] Apr 23 16:54:12.343744 ip-10-0-133-231 kubenswrapper[2572]: W0423 16:54:12.343719 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3002bc6d_a283_41cf_a543_57889b0b9d4d.slice/crio-9ff0b65594e932ff12064fd997aad86d0ab4be10f02de447f9c7efdf4cb9693a WatchSource:0}: Error finding container 9ff0b65594e932ff12064fd997aad86d0ab4be10f02de447f9c7efdf4cb9693a: Status 404 returned error can't find the container with id 9ff0b65594e932ff12064fd997aad86d0ab4be10f02de447f9c7efdf4cb9693a Apr 23 16:54:12.345223 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.345205 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:54:12.432798 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.432776 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-g9pgx_d997e103-1c8d-4bb4-a579-2d6b344c089f/dns/0.log" Apr 23 16:54:12.457731 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.457715 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-g9pgx_d997e103-1c8d-4bb4-a579-2d6b344c089f/kube-rbac-proxy/0.log" Apr 23 16:54:12.578543 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.578481 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mqskj_7789b2de-75cc-4057-8981-8850b48ac765/dns-node-resolver/0.log" Apr 23 16:54:12.797313 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.797275 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" 
event={"ID":"3002bc6d-a283-41cf-a543-57889b0b9d4d","Type":"ContainerStarted","Data":"797354d06b8216fa7444a48433df076312acb95dfdfa4e06c211ba7e098b7656"} Apr 23 16:54:12.797313 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.797317 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" event={"ID":"3002bc6d-a283-41cf-a543-57889b0b9d4d","Type":"ContainerStarted","Data":"9ff0b65594e932ff12064fd997aad86d0ab4be10f02de447f9c7efdf4cb9693a"} Apr 23 16:54:12.797694 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.797418 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" Apr 23 16:54:12.816260 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.816213 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" podStartSLOduration=1.816201169 podStartE2EDuration="1.816201169s" podCreationTimestamp="2026-04-23 16:54:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:54:12.81461254 +0000 UTC m=+1206.387125476" watchObservedRunningTime="2026-04-23 16:54:12.816201169 +0000 UTC m=+1206.388714147" Apr 23 16:54:12.987083 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:12.987058 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-dp4mb_10bc9a50-0524-4e20-a0cb-cfbcb6a0f1fa/node-ca/0.log" Apr 23 16:54:13.730264 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:13.730235 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-77d69779b-8754l_fa0af365-ad6e-4695-bd7d-c6838cbcf027/router/0.log" Apr 23 16:54:14.109134 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:14.109061 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-hsn4f_b4778171-44e0-4227-8cef-29899b536604/serve-healthcheck-canary/0.log" Apr 23 16:54:14.457827 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:14.457799 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-lfp22_af33177a-4a99-4b44-8427-ff5e05da026f/insights-operator/0.log" Apr 23 16:54:14.458019 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:14.457935 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-lfp22_af33177a-4a99-4b44-8427-ff5e05da026f/insights-operator/1.log" Apr 23 16:54:14.543072 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:14.543043 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c2hhr_a0a3f4ea-23a5-4bdf-b548-d2b2314b583b/kube-rbac-proxy/0.log" Apr 23 16:54:14.566054 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:14.566034 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c2hhr_a0a3f4ea-23a5-4bdf-b548-d2b2314b583b/exporter/0.log" Apr 23 16:54:14.602755 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:14.602733 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c2hhr_a0a3f4ea-23a5-4bdf-b548-d2b2314b583b/extractor/0.log" Apr 23 16:54:18.809895 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:18.809869 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwtnd" Apr 23 16:54:20.377280 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:20.377239 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-g4ksv_6fd373ce-6bf6-480e-b726-08169e5b5b2f/migrator/0.log" Apr 23 16:54:20.399235 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:20.399205 2572 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-g4ksv_6fd373ce-6bf6-480e-b726-08169e5b5b2f/graceful-termination/0.log"
Apr 23 16:54:20.744971 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:20.744937 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-wg4fk_970788cb-b97f-467f-bd8e-69787c8efef5/kube-storage-version-migrator-operator/1.log"
Apr 23 16:54:20.745616 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:20.745597 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-wg4fk_970788cb-b97f-467f-bd8e-69787c8efef5/kube-storage-version-migrator-operator/0.log"
Apr 23 16:54:22.100239 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:22.100208 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-r652k_f24243b7-5732-41e7-a97d-ff3ef6a751d0/kube-multus-additional-cni-plugins/0.log"
Apr 23 16:54:22.121727 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:22.121703 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-r652k_f24243b7-5732-41e7-a97d-ff3ef6a751d0/egress-router-binary-copy/0.log"
Apr 23 16:54:22.142890 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:22.142871 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-r652k_f24243b7-5732-41e7-a97d-ff3ef6a751d0/cni-plugins/0.log"
Apr 23 16:54:22.165424 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:22.165407 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-r652k_f24243b7-5732-41e7-a97d-ff3ef6a751d0/bond-cni-plugin/0.log"
Apr 23 16:54:22.186813 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:22.186790 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-r652k_f24243b7-5732-41e7-a97d-ff3ef6a751d0/routeoverride-cni/0.log"
Apr 23 16:54:22.208848 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:22.208832 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-r652k_f24243b7-5732-41e7-a97d-ff3ef6a751d0/whereabouts-cni-bincopy/0.log"
Apr 23 16:54:22.230897 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:22.230874 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-r652k_f24243b7-5732-41e7-a97d-ff3ef6a751d0/whereabouts-cni/0.log"
Apr 23 16:54:22.282617 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:22.282594 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wjwgw_bf96e64c-13d1-4533-b0fb-a69566795f63/kube-multus/0.log"
Apr 23 16:54:22.345142 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:22.345112 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-glcj7_eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef/network-metrics-daemon/0.log"
Apr 23 16:54:22.365702 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:22.365640 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-glcj7_eebbda0b-dcd9-4f01-af8f-d107d7a5e1ef/kube-rbac-proxy/0.log"
Apr 23 16:54:23.200010 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:23.199981 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95t7c_4b2c8879-054c-4712-b5f0-7d3038cf3e84/ovn-controller/0.log"
Apr 23 16:54:23.230094 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:23.230070 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95t7c_4b2c8879-054c-4712-b5f0-7d3038cf3e84/ovn-acl-logging/0.log"
Apr 23 16:54:23.252533 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:23.252499 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95t7c_4b2c8879-054c-4712-b5f0-7d3038cf3e84/kube-rbac-proxy-node/0.log"
Apr 23 16:54:23.276263 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:23.276242 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95t7c_4b2c8879-054c-4712-b5f0-7d3038cf3e84/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 16:54:23.296881 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:23.296810 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95t7c_4b2c8879-054c-4712-b5f0-7d3038cf3e84/northd/0.log"
Apr 23 16:54:23.319996 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:23.319971 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95t7c_4b2c8879-054c-4712-b5f0-7d3038cf3e84/nbdb/0.log"
Apr 23 16:54:23.343536 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:23.343510 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95t7c_4b2c8879-054c-4712-b5f0-7d3038cf3e84/sbdb/0.log"
Apr 23 16:54:23.429734 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:23.429707 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95t7c_4b2c8879-054c-4712-b5f0-7d3038cf3e84/ovnkube-controller/0.log"
Apr 23 16:54:24.925095 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:24.925058 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-7rh2n_2b95d992-bc38-4499-9a6e-ff4e0571a154/check-endpoints/0.log"
Apr 23 16:54:24.996321 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:24.996278 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-d7t2c_1fcbd9bc-88ba-48d1-978b-f8e2585ab84c/network-check-target-container/0.log"
Apr 23 16:54:25.825510 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:25.825480 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-cx8tc_0cb82eee-8d58-46f1-8148-5a83f7d6a3a1/iptables-alerter/0.log"
Apr 23 16:54:26.462041 ip-10-0-133-231 kubenswrapper[2572]: I0423 16:54:26.461996 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-sjmb9_821ae57d-81ac-4242-a0c9-51cdf1716096/tuned/0.log"